28. Detecting Common Gestures using the Android Gesture Detector Class

The term “gesture” is used to define a contiguous sequence of interactions between the touch screen and the user. A typical gesture begins at the point that the screen is first touched and ends when the last finger or pointing device leaves the display surface. When correctly harnessed, gestures can be implemented as a form of communication between user and application. Swiping motions to turn the pages of an eBook, or a pinching movement involving two touches to zoom in or out of an image are prime examples of the ways in which gestures can be used to interact with an application.

The Android SDK provides mechanisms for the detection of both common and custom gestures within an application. Common gestures involve interactions such as a tap, double tap, long press or a swiping motion in either a horizontal or a vertical direction (referred to in Android nomenclature as a fling).

The goal of...

28.1 Implementing Common Gesture Detection

When a user interacts with the display of an Android device, the onTouchEvent() method of the currently active application is called by the system and passed MotionEvent objects containing data about the user’s contact with the screen. This data can be interpreted to identify if the motion on the screen matches a common gesture such as a tap or a swipe. This can be achieved with very little programming effort by making use of the Android GestureDetectorCompat class. This class is designed specifically to receive motion event information from the application and to trigger method calls based on the type of common gesture, if any, detected.

The basic steps in detecting common gestures are as follows:

1. Declaration of a class which implements the GestureDetector.OnGestureListener interface including the required onFling(), onDown(), onScroll(), onShowPress(), onSingleTapUp() and onLongPress() callback methods. Note that this can...
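By way of illustration, a minimal sketch of such a class declaration might take the following form. This is a hedged outline only: the assumption that the activity also implements GestureDetector.OnDoubleTapListener is based on the double tap support added later in the chapter, and the stub bodies simply return default values:

package com.ebookfrenzy.commongestures;

import androidx.appcompat.app.AppCompatActivity;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class MainActivity extends AppCompatActivity
        implements GestureDetector.OnGestureListener,
                   GestureDetector.OnDoubleTapListener {

    // GestureDetector.OnGestureListener callbacks
    @Override
    public boolean onDown(MotionEvent e) { return true; }

    @Override
    public void onShowPress(MotionEvent e) { }

    @Override
    public boolean onSingleTapUp(MotionEvent e) { return true; }

    @Override
    public boolean onScroll(MotionEvent e1, MotionEvent e2,
                            float distanceX, float distanceY) { return true; }

    @Override
    public void onLongPress(MotionEvent e) { }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) { return true; }

    // GestureDetector.OnDoubleTapListener callbacks
    @Override
    public boolean onSingleTapConfirmed(MotionEvent e) { return true; }

    @Override
    public boolean onDoubleTap(MotionEvent e) { return true; }

    @Override
    public boolean onDoubleTapEvent(MotionEvent e) { return true; }
}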

28.2 Creating an Example Gesture Detection Project

The goal of this project is to detect the full range of common gestures currently supported by the GestureDetectorCompat class and to display status information to the user indicating the type of gesture that has been detected.

Select the Create New Project quick start option from the welcome screen and, within the resulting new project dialog, choose the Empty Activity template before clicking on the Next button.

Enter CommonGestures into the Name field and specify com.ebookfrenzy.commongestures as the package name. Before clicking on the Finish button, change the Minimum API level setting to API 26: Android 8.0 (Oreo) and the Language menu to Java.

Once the new project has been created, navigate to the app -> res -> layout -> activity_main.xml file in the Project tool window and double-click on it to load it into the Layout Editor tool.

Within the Layout Editor tool, select the “Hello, World!” TextView...

28.3 Creating the GestureDetectorCompat Instance

With the activity class now updated to implement the listener interfaces, the next step is to create an instance of the GestureDetectorCompat class. Since this only needs to be performed once, at the point that the activity is created, the best place for this code is the onCreate() method. Because double taps also need to be detected, the code must additionally call the setOnDoubleTapListener() method of the GestureDetectorCompat instance:

package com.ebookfrenzy.commongestures;

import androidx.appcompat.app.AppCompatActivity;
import android.os.Bundle;
import android.view.GestureDetector;
import android.widget.TextView;
import android.view.MotionEvent;
import androidx.core.view.GestureDetectorCompat;

public class MainActivity extends AppCompatActivity
        implements GestureDetector.OnGestureListener,
        ...
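The listing above is truncated in this excerpt. A minimal sketch of how the remainder might continue is shown below, assuming that the detector is held in a field named gDetector (the name used by the onTouchEvent() code in the next section), that the activity also implements GestureDetector.OnDoubleTapListener, and that the status TextView uses the illustrative id gestureStatusText (not confirmed by this excerpt):

    private TextView gestureText;            // status TextView; the id below is assumed for illustration
    private GestureDetectorCompat gDetector; // field name matches the onTouchEvent() code in the next section

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        gestureText = findViewById(R.id.gestureStatusText); // assumed id, not confirmed by this excerpt

        // Create the gesture detector, registering this activity as the listener
        this.gDetector = new GestureDetectorCompat(this, this);

        // Also route double tap events to this activity
        gDetector.setOnDoubleTapListener(this);
    }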

28.4 Implementing the onTouchEvent() Method

If the application were to be compiled and run at this point, nothing would happen if gestures were performed on the device display. This is because no code has been added to intercept touch events and to pass them through to the GestureDetectorCompat instance. In order to achieve this, it is necessary to override the onTouchEvent() method within the activity class and implement it such that it calls the onTouchEvent() method of the GestureDetectorCompat instance. Remaining in the MainActivity.java file, therefore, implement this method so that it reads as follows:

@Override
public boolean onTouchEvent(MotionEvent event) {
    this.gDetector.onTouchEvent(event);
    // Be sure to call the superclass implementation
    return super.onTouchEvent(event);
}
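Although the individual listener method bodies are not included in this excerpt, the status updates described in the next section imply implementations along the following lines. This is a speculative sketch; the gestureText field is the illustrative TextView reference assumed in the earlier onCreate() sketch and does not appear in the original listing:

@Override
public boolean onDown(MotionEvent event) {
    gestureText.setText("onDown"); // report the detected gesture to the user
    return true;                   // return true so subsequent events continue to be delivered
}

@Override
public boolean onFling(MotionEvent event1, MotionEvent event2,
                       float velocityX, float velocityY) {
    gestureText.setText("onFling");
    return true;
}

@Override
public boolean onDoubleTap(MotionEvent event) {
    gestureText.setText("onDoubleTap");
    return true;
}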

28.5 Testing the Application

Compile and run the application on either a physical Android device or an AVD emulator. Once launched, experiment with swipes, presses, scrolling motions and double and single taps. Note that the text view updates to reflect the events as illustrated in Figure 28-1:

Figure 28-1

28.6 Summary

Any physical contact between the user and the touch screen display of a device can be considered a “gesture”. Lacking the physical keyboard and mouse pointer of a traditional computer system, gestures are widely used as a method of interaction between user and application. While a gesture can consist of just about any sequence of motions, there is a widely used set of gestures with which users of touch screen devices have become familiar. A number of these so-called “common gestures” can be easily detected within an application by making use of the Android Gesture Detector classes. In this chapter, the use of this technique has been outlined both in theory and through the implementation of an example project.

Having covered common gestures in this chapter, the next chapter will look at detecting a wider range of gesture types including the ability to both design and detect your own gestures.
