Chapter 2. Playing with Sensors

In this chapter, we will learn how to write our first sensor program. We will also understand the various callbacks, and how to use these callbacks in the foreground activity and background service. This chapter will also walk you through a basic algorithm developed using sensor values.

We will cover the following topics in this chapter:

  • Understanding various sensor framework callbacks
  • Using sensors in the foreground activity
  • Listing the available sensors on a device
  • Knowing individual sensors' capabilities
  • Getting the sensor values and updating the user interface
  • Monitoring sensor values in the background service

Understanding the sensor framework callbacks


The two most important callbacks of the sensor framework are the onSensorChanged() and onAccuracyChanged() methods. In order to write efficient sensor code, it's important to understand when these methods are called and what processing we can do in them. These callbacks are methods of the SensorEventListener interface, which needs to be implemented in the class where the callbacks are to be received:

onSensorChanged() is the first callback and has the following syntax:

@Override
public void onSensorChanged(SensorEvent event) {
}

Depending on the reporting mode of the sensor, this method is called either at a regular frequency (continuous mode) or only when a sensor value changes from the previously reported value (on-change mode). The onSensorChanged() method provides the sensor values inside the float values[] array of the SensorEvent object. These sensor values are different from the...
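
The second callback, onAccuracyChanged(), is invoked whenever the reported accuracy of a sensor changes. The following is a minimal sketch of a class implementing both callbacks; the class and field names are illustrative, not taken from the book's sample project:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    // A minimal listener sketch; names are illustrative only.
    public class SimpleSensorListener implements SensorEventListener {

        @Override
        public void onSensorChanged(SensorEvent event) {
            // event.values holds the latest readings; keep this method fast.
            float firstValue = event.values[0];
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Called when the reported accuracy changes, for example to
            // SensorManager.SENSOR_STATUS_ACCURACY_HIGH.
        }
    }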

Time for action – using sensors in the foreground activity


In this section, we will explore how to use sensors in an activity. This is the most basic and straightforward way of using sensors, and it's also the most efficient way if your sensor functionality is tied only to that activity:

  1. The first step is to implement the SensorEventListener interface in our activity so that it can receive SensorEvent objects through the onSensorChanged() method. The following code snippet shows the necessary import statements and the class declaration:

           import android.app.Activity; 
           import android.content.Context; 
           import android.hardware.Sensor; 
           import android.hardware.SensorEvent; 
           import android.hardware.SensorEventListener; 
           import android.hardware.SensorManager; 
           import android.os.Bundle; 
     
           public class SensorActivity extends Activity implements SensorEventListener {
    
  2. Now, we will create the...
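
As a hedged sketch of the typical lifecycle wiring for a sensor-driven activity, the following obtains the SensorManager and a default sensor in onCreate(), then registers and unregisters the listener in onResume() and onPause(). The choice of the gyroscope and the delay constant are illustrative assumptions; the book's actual sample may differ:

    private SensorManager mSensorManager;
    private Sensor mSensor;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        mSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register only while the activity is visible to save battery.
        mSensorManager.registerListener(this, mSensor,
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }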

Time for action – listing the available sensors on a device


There are multiple sensors available on a device. In this section, we will learn how to get a list of all the available sensors. We will populate the names of the available sensors in a list and display it on the screen using a ListView.

  1. The following code block shows the declarations required by the activity. We don't need the SensorEventListener interface, as we will not be dealing with the sensor values. We declare the ListView, ListAdapter, and SensorManager, along with the list of Sensor objects used to populate the list:

           public class SensorListActivity extends Activity implements OnItemClickListener {
     
             private SensorManager mSensorManager; 
             private ListView mSensorListView; 
             private ListAdapter mListAdapter; 
             private List<Sensor> mSensorsList; 
    
  2. In the onCreate() method, we instantiate our SensorManager, ListView, and ListAdapter...
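
As a hedged sketch of that instantiation step, the onCreate() method might look like the following; the layout and view IDs (activity_sensor_list, sensor_list) are illustrative placeholders, not the book's actual resources, and the ListView/ArrayAdapter widget imports are assumed:

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sensor_list);  // hypothetical layout

        mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // TYPE_ALL returns every sensor available on the device.
        mSensorsList = mSensorManager.getSensorList(Sensor.TYPE_ALL);

        String[] sensorNames = new String[mSensorsList.size()];
        for (int i = 0; i < mSensorsList.size(); i++) {
            sensorNames[i] = mSensorsList.get(i).getName();
        }

        mSensorListView = (ListView) findViewById(R.id.sensor_list);  // hypothetical ID
        mListAdapter = new ArrayAdapter<String>(this,
                android.R.layout.simple_list_item_1, sensorNames);
        mSensorListView.setAdapter(mListAdapter);
        mSensorListView.setOnItemClickListener(this);
    }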

Time for action – knowing individual sensors' capabilities


Android phones are manufactured by different OEMs, which source their sensors from different vendors. It is quite possible that two different Android phones have different gyroscope sensors, with different ranges and other properties. Before developing universal logic based on sensors, it's important to keep in mind each sensor's individual properties and capabilities, which may vary from device to device. In this section, we will explore the common methods for finding out the properties and capabilities of a sensor:

  1. We will show the sensor properties in individual TextView elements on the screen. In the following code snippet, the TextView, Sensor, and SensorManager variables are declared:

          public class SensorCapabilityActivity extends Activity { 
     
            private SensorManager mSensorManager; 
            private int mSensorType; 
            private Sensor mSensor; 
            private TextView mSensorNameTextView...
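
As a hedged sketch of how these capabilities can be read, the following snippet queries the standard Sensor getters; it assumes mSensorType has already been set (for example, from the launching intent) and reuses the declared mSensorNameTextView field:

    mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    // mSensorType is assumed to be set elsewhere, e.g. passed in via the intent.
    mSensor = mSensorManager.getDefaultSensor(mSensorType);

    if (mSensor != null) {
        String name = mSensor.getName();             // Name reported by the vendor
        String vendor = mSensor.getVendor();         // Sensor manufacturer
        float maxRange = mSensor.getMaximumRange();  // Maximum value it can report
        float resolution = mSensor.getResolution();  // Smallest detectable change
        float power = mSensor.getPower();            // Power usage in mA
        int minDelay = mSensor.getMinDelay();        // Minimum sampling delay in microseconds

        mSensorNameTextView.setText(name + " (" + vendor + ")");
    }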

Time for action – getting the sensor values and updating the user interface


Now, let's deal with the most important aspect of sensors, that is, working with the sensor values. We have created a common activity and screen that can accommodate the varying number of values reported by all sensor types. Different sensors report different numbers of values: sensors such as temperature, pressure, light, and proximity report only one value, while sensors such as the magnetometer, accelerometer, gyroscope, linear acceleration, and gravity report three values, one for each of the x, y, and z axes. Other sensors can report more than three values, for example, the rotation vector, geomagnetic rotation vector, game rotation vector, and uncalibrated gyroscope. All the sensor values are passed in an array called values[], which is part of the SensorEvent object.

  1. We have created a generic, common SensorValuesActivity to display all the values coming from the different sensors. We are using the length of the values[] array to determine the...
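
As a hedged sketch of that display logic, onSensorChanged() can iterate over the values[] array and render one line per component; mSensorValuesTextView is an illustrative field name, not necessarily the one used in the book's sample:

    @Override
    public void onSensorChanged(SensorEvent event) {
        StringBuilder text = new StringBuilder();
        // event.values.length tells us how many components this sensor reports.
        for (int i = 0; i < event.values.length; i++) {
            text.append("values[").append(i).append("] = ")
                .append(event.values[i]).append("\n");
        }
        mSensorValuesTextView.setText(text.toString());  // hypothetical TextView field
    }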

Time for action – processing the sensor values in the background service


There will be cases where your app needs to listen to and process sensor values in the background. Such cases cannot be handled using the activity code structure; instead, we need to use an Android service to handle background sensor processing. Let's discuss the background processing scenario with an example.

The phone handling algorithm

In our example, we will play a small MP3 sound when somebody picks up or handles the phone. We will call this event the phone handling event, and we will use an Android background service to continuously process the gyroscope sensor values in order to detect it. The gyroscope gives the rate of rotation of the phone around each of the x, y, and z axes. When your phone is kept still, the gyroscope reports no rotation, or a very low rate of rotation, but when the phone is picked up or handled, the reported rate of rotation rises sharply. We will use this logic to define our...
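
The following is a simplified sketch of such a phone-handling detector implemented as a background service. The threshold value and the R.raw.beep sound resource are illustrative assumptions, and the book's actual implementation may differ:

    import android.app.Service;
    import android.content.Context;
    import android.content.Intent;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.media.MediaPlayer;
    import android.os.IBinder;

    public class PhoneHandlingService extends Service implements SensorEventListener {

        // Illustrative threshold in rad/s; tune it per device.
        private static final float ROTATION_THRESHOLD = 2.0f;
        private SensorManager mSensorManager;
        private MediaPlayer mMediaPlayer;

        @Override
        public void onCreate() {
            super.onCreate();
            mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            Sensor gyroscope = mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
            mSensorManager.registerListener(this, gyroscope,
                    SensorManager.SENSOR_DELAY_NORMAL);
            // R.raw.beep is a hypothetical sound resource.
            mMediaPlayer = MediaPlayer.create(this, R.raw.beep);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Combine the rotation rates around the x, y, and z axes.
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            double magnitude = Math.sqrt(x * x + y * y + z * z);
            if (magnitude > ROTATION_THRESHOLD && !mMediaPlayer.isPlaying()) {
                mMediaPlayer.start();  // Phone-handling event detected
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        @Override
        public void onDestroy() {
            mSensorManager.unregisterListener(this);
            mMediaPlayer.release();
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null;  // Started service, not a bound one
        }
    }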

What just happened?


We just created a basic sensor algorithm to detect the phone handling event by processing the gyroscope values in a background service. As a best practice, it's suggested that you don't block the onSensorChanged() method. In our onSensorChanged() callback, we only do very simple calculations that will complete before the next callback arrives. If you have any doubt about whether a calculation is simple or complex, the best way to check is to log the time before and after the calculation, and compare the difference with the time interval between the callbacks.
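
A rough sketch of that timing check is shown below; the tag string is illustrative, and it assumes android.os.SystemClock and android.util.Log are imported:

    @Override
    public void onSensorChanged(SensorEvent event) {
        long start = SystemClock.elapsedRealtime();

        // ... your sensor calculation goes here ...

        long elapsed = SystemClock.elapsedRealtime() - start;
        Log.d("SensorTiming", "Processing took " + elapsed + " ms");
    }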

Summary


We looked at the sensor initialization cycle, the important callbacks, and the common ways of using sensors in an activity and in a background service. It is advisable to check a sensor's properties and capabilities before writing universal logic intended to work with all types of sensors on different devices. We also learned about sensor processing in the background and developed our first sensor algorithm.

In the next chapter, we will extend our understanding by developing a real-world application using different types of sensors. We will also take a closer look at how we can use the sensor values to derive more useful data.
