Android Application Testing: TDD and the Temperature Converter


Android Application Testing Guide


Getting started with TDD

Briefly, Test Driven Development is the strategy of writing tests throughout the development process. These test cases are written in advance of the code that is supposed to satisfy them.

A single test is added, then the code needed to make this test compile is written, and finally the full set of test cases is run to verify the results.

This contrasts with other approaches to the development process where the tests are written at the end when all the coding has been done.

Writing the tests in advance of the code that satisfies them has several advantages. First, the tests get written one way or another, whereas if they are left until the end it is highly probable that they will never be written. Second, developers take more responsibility for the quality of their work.

Design decisions are taken in single steps and finally the code satisfying the tests is improved by refactoring it.

This UML activity diagram depicts the Test Driven Development process to help us understand it:

[Image: UML activity diagram of the Test Driven Development process]

The following sections explain the individual activities depicted in this activity diagram.

Writing a test case

We start our development process by writing a test case. This apparently simple step puts some machinery to work inside our heads. After all, it is not possible to write code, tested or not, without a clear understanding of the problem domain and its details. Usually, this step brings you face to face with the aspects of the problem you don't understand yet, and which you need to grasp if you want to model them and write the code.

Running all tests

Once the test is written, the obvious next step is to run it, together with the other tests we have written so far. Here, the importance of an IDE with built-in support for the testing environment is perhaps more evident than in other situations, as it can cut the development time by a good fraction. It is expected that, at first, our test fails, as we still haven't written any code!

To be able to complete our test, we usually write additional code and take design decisions. The additional code written is the minimum possible to get our test to compile. Consider here that not compiling is failing.

When we get the test to compile and run, if the test fails then we try to write the minimum amount of code necessary to make the test succeed. This may sound awkward at this point but the following code example in this article will help you understand the process.

Optionally, instead of running all the tests again, you can run just the newly added test first to save some time, as running the tests on the emulator can sometimes be rather slow. Then run the whole test suite to verify that everything is still working properly. We don't want to add a new feature that breaks an existing one.

Refactoring the code

When the test succeeds, we refactor the code added to keep it tidy, clean, and minimal.

We run all the tests again to verify that our refactoring has not broken anything. If the tests pass again and no more refactoring is needed, we have finished our task.

Running the tests after refactoring is an incredible safety net put in place by this methodology. If we make a mistake refactoring an algorithm, extracting variables, introducing parameters, changing signatures, or whatever else our refactoring consists of, this testing infrastructure will detect the problem. Furthermore, if some refactoring or optimization might not be valid for every possible case, we can verify it for every case used by the application and expressed as a test case.

What is the advantage?

Personally, the main advantage I've seen so far is that you reach your destination quickly, and it is much more difficult to get diverted into implementing options in your software that will never be used. Implementing unneeded features is a waste of your precious development time and effort, and as you may already know, judiciously administering these resources may be the difference between successfully reaching the end of the project or not. Test Driven Development probably cannot be applied indiscriminately to every project. I think that, as with any other technique, you should use your judgment and expertise to recognize where it can be applied and where it cannot. But keep this in mind: there are no silver bullets.

The other advantage is that you always have a safety net for your changes. Every time you change a piece of code, you can be absolutely sure that other parts of the system are not affected as long as there are tests verifying that the conditions haven't changed.

Understanding the requirements

To be able to write a test about any subject, we should first understand the subject under test.

We also mentioned that one of the advantages is that you reach your destination quickly instead of orbiting around the requirements.

Translating requirements into tests and cross referencing them is perhaps the best way to understand the requirements, and be sure that there is always an implementation and verification for all of them. Also, when the requirements change (something that is very frequent in software development projects), we can change the tests verifying these requirements and then change the implementation to be sure that everything was correctly understood and mapped to code.
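As a toy illustration of this mapping (plain Java; the class and method names here are ours, not part of the project), a requirement such as "values entered are decimal values with two digits after the point" translates almost literally into an assertion against a helper that enforces it:

```java
public class RequirementCheck {

    // Hypothetical helper enforcing the "two digits after the point" requirement.
    public static double roundTwoDecimals(double value) {
        return Math.round(value * 100.0) / 100.0;
    }

    public static void main(String[] args) {
        // The requirement, expressed as a check we can run on every change.
        assert roundTwoDecimals(12.3456) == 12.35 : "requirement violated";
        System.out.println("requirement holds");
    }
}
```

When the requirement changes, this assertion changes first, and the implementation follows, exactly the cross-referencing described above.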

Creating a sample project—the Temperature Converter

Our examples will revolve around an extremely simple Android sample project. It doesn't try to show all the fancy Android features but focuses on testing and gradually building the application from the test, applying the concepts learned before.

Let's pretend that we have received a list of requirements to develop an Android temperature converter application. Though oversimplified, we will be following the steps you normally would to develop such an application. However, in this case we will introduce the Test Driven Development techniques in the process.

The list of requirements

More often than not, the list of requirements is very vague, and a high number of details are not fully covered.

As an example, let's pretend that we receive this list from the project owner:

  • The application converts temperatures from Celsius to Fahrenheit and vice-versa
  • The user interface presents two fields to enter the temperatures, one for Celsius and the other for Fahrenheit
  • When one temperature is entered in one field the other one is automatically updated with the conversion
  • If there are errors, they should be displayed to the user, possibly using the same fields
  • Some space in the user interface should be reserved for the on-screen keyboard, to ease operation when several conversions are entered
  • Entry fields should start empty
  • Values entered are decimal values with two digits after the point
  • Digits are right aligned
  • Last entered values should be retained even after the application is paused
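The first requirement above already suggests the core logic we will eventually drive out with tests. As a sketch (the class name TemperatureConverter and its method names are our assumption; the article has not introduced this class yet), the two conversions are:

```java
public class TemperatureConverter {

    // Celsius to Fahrenheit: multiply by 9/5 and add 32.
    public static double celsiusToFahrenheit(double celsius) {
        return celsius * 9.0 / 5.0 + 32.0;
    }

    // Fahrenheit to Celsius: subtract 32 and multiply by 5/9.
    public static double fahrenheitToCelsius(double fahrenheit) {
        return (fahrenheit - 32.0) * 5.0 / 9.0;
    }

    public static void main(String[] args) {
        System.out.println(celsiusToFahrenheit(100.0)); // boiling point: 212.0
    }
}
```

In true TDD fashion, a class like this would only appear after a test demanded it; we show it here only to make the requirement concrete.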

User interface concept design

Let's assume that we receive this conceptual user interface design from the User Interface Design team:

[Image: conceptual user interface design for the Temperature Converter]

Creating the projects

Our first step is to create the project. As we mentioned earlier, we are creating a main and a test project. The following screenshot shows the creation of the TemperatureConverter project (all values are typical Android project values):

[Image: New Android Project wizard creating the TemperatureConverter project]

When you are ready to continue you should press the Next > button in order to create the related test project.

The creation of the test project is displayed in this screenshot. All values will be selected for you based on your previous entries:

[Image: New Android Test Project wizard]


Creating the TemperatureConverterActivityTests project

We only have some templates in our main project created by the Android ADT plugin, such as:

  • TemperatureConverterActivity
  • main.xml layout
  • strings.xml resources
  • Other resources, like icons

Additionally, we have some templates created in our test project. Besides the corresponding test package, which keeps our tests separated from the main package, the wizard creates copies of:

  • main.xml layout
  • strings.xml resources
  • Other resources, like icons

Be very cautious and don't let the template files fool you. There is little or no use for these resources in the test project, so to avoid confusion you should delete them. If later on you discover that some tests require specific resources, you can add only the needed ones.

Proceed with creating the first test by selecting the main test package in Eclipse's Package Explorer and right-clicking on it. Then select New | JUnit Test Case.

You should have a dialog like this:

[Image: New JUnit Test Case dialog]

Here, you need to enter the following:



  • New JUnit 3 test: JUnit 3 is the version supported by Android. Always use this option.
  • Source folder: The default source folder for the tests. The default value should be fine.
  • Package: The default package for the tests. This is usually the default package name of your main project followed by the subpackage test.
  • Name: The name of the class for this test. The best practice here is to use the name of the class under test followed by the word Tests, in plural, because most probably we will be hosting several tests in it.
  • Superclass: We should select our superclass depending on what and how we are going to test. In this particular case, because we are testing a single Activity and using the system infrastructure, we use ActivityInstrumentationTestCase2. Also note that as ActivityInstrumentationTestCase2 is a generic class, we need the template parameter as well. This is the Activity under test, which in our case is TemperatureConverterActivity.

We can ignore the warning indicating that the superclass does not exist for now; we will be fixing the imports soon.

  • Method stubs: Select the method stubs you want created. If at this point you are not sure what you will need, then select them all, as the default stubs will simply invoke their super counterparts.
  • Do you want to add comments?: Generates Javadoc comments for the stub test methods.

Usually, unless you have changed the default template in Code Templates, the generated comment will be:

/**
 * Test method for {@link method()}.
 */

  • Class under test: This is the class we are testing, TemperatureConverterActivity in this case. This field is most useful in other situations, where the class under test has already been implemented and we can select the list of methods we would like to test. Remember that in our case we haven't implemented the class yet, so we will be presented with the only method that is in the Android ADT plugin template, which is onCreate.

This situation, where the class under test has not been implemented yet and only the method created by the Android ADT plugin is available, is better understood by pressing Next >. Here, the list of methods available to test is presented, and in our case we don't have any methods implemented yet other than onCreate and the methods inherited from Activity.

[Image: test method selection page of the New JUnit Test Case wizard]

This dialog has the following components:



  • Available methods: This is the list of all the methods we may want to test. When methods are overloaded, test names are generated accordingly to cope with the situation, and parameter names are mangled into the test name.
  • Create final method stubs: Convenience option to add the final modifier to the stub methods. The final modifier prevents these methods from being overridden by a subclass.
  • Create tasks for generated test methods: Creates a TODO comment in each generated test method.

Either way, we could select onCreate(Bundle) to have the testOnCreateBundle method generated for us, but we are leaving the selection list empty for now to avoid adding extra complexity to this simple demonstration application.

We now notice that our automatically generated class has some errors that we need to fix; otherwise they will prevent the tests from running.

  • First we should add the missing imports, using the shortcut Shift+Ctrl+O.
  • Second, we need to fix the problem related to the constructors. As the pattern dictates, we need to implement a no-argument constructor:

    public TemperatureConverterActivityTests() {
        this("TemperatureConverterActivityTests");
    }

    public TemperatureConverterActivityTests(String name) {
        super(TemperatureConverterActivity.class);
        setName(name);
    }

  • We added the no-argument constructor TemperatureConverterActivityTests(). From this constructor, we invoke the constructor that takes a name as a parameter.
  • Finally, in this given-name constructor, we invoke the super constructor and set the name.

To verify that everything has been set up in place, you may run the tests by using Run as | Android JUnit Test. There are no tests to run yet but at least we can verify that the infrastructure supporting our tests is already in place.

Creating the fixture

We can start creating our test fixture by populating the setUp method with the elements we need in our tests. Almost unavoidable, in this case, is the use of the Activity under test, so let's prepare for the situation and add it to the fixture:

protected void setUp() throws Exception {
    super.setUp();
    mActivity = getActivity();
}

Let's create the mActivity field as proposed by Eclipse.

The ActivityInstrumentationTestCase2.getActivity() method has a side effect: if the Activity under test is not running, it is started. This may change the intention of a test if we use getActivity() as a simple accessor several times in a test and, for some reason, the Activity finishes or crashes before the test completes. We would be inadvertently restarting the Activity; that is why in our tests we discourage the use of getActivity() in favor of keeping the Activity in the fixture.

Test preconditions

We mentioned this before and this can be identified as another pattern. It's very useful to test all the preconditions and be sure that our fixture has been created correctly.

public final void testPreconditions() {
    assertNotNull(mActivity);
}

That is, let's check that our fixture is composed of "not null" values.

We can run the tests to verify that everything is correct and green as shown in this screenshot:

[Image: all tests green in the Eclipse JUnit view]

Creating the user interface

Back on our Test Driven Development track, our concise list of requirements states that there should be two entry fields, for the Celsius and Fahrenheit temperatures respectively. So let's add them to our test fixture.

They don't exist yet, and we haven't even started designing the user interface layout, but we know that there should be two entries like these for sure.

This is the code you should add to the setUp() method:

mCelsius = (EditText) mActivity.findViewById(R.id.celsius);
mFahrenheit = (EditText) mActivity.findViewById(R.id.fahrenheit);

There are some important things to notice:

  • We define the fields for our fixture using EditText, which we must import
  • We use previously created mActivity to find the Views by ID
  • We use the R class for the main project, not the one in the test project

Testing the existence of the user interface components

Once we have added them to the setUp() method, as indicated in the previous section, we can check their existence in a specific test:

public final void testHasInputFields() {
    assertNotNull(mCelsius);
    assertNotNull(mFahrenheit);
}

We are not able to run the tests yet because we must fix some compilation problems first. We should fix the missing IDs in the R class.

Having created a test fixture that references elements and IDs in the user interface that we don't have yet, the Test Driven Development paradigm mandates that we add the code needed to satisfy our tests. The first thing we should do is get everything to compile, at least; then, if some tests exercise unimplemented features, they will simply fail.

Getting the IDs defined

Our first stop is to get the IDs for the user interface elements defined in the R class, so that the errors generated by referencing the undefined constants go away.

You, as an experienced Android developer, know how to do it. I'll give you a refresher anyway. Open the main.xml layout in the layout editor and add the required user interface components to get something that resembles the design previously introduced in the section User Interface concept design.

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent" android:layout_height="match_parent">
    <TextView android:layout_width="wrap_content" android:layout_height="wrap_content" android:text="@string/message" />
    <TextView android:id="@+id/celsius_label" android:layout_width="wrap_content" android:layout_height="wrap_content" android:text="@string/celsius" />
    <EditText android:id="@+id/celsius" android:layout_width="wrap_content" android:layout_height="wrap_content" android:text="EditText" />
    <TextView android:id="@+id/fahrenheit_label" android:layout_width="wrap_content" android:layout_height="wrap_content" android:text="@string/fahrenheit" />
    <EditText android:id="@+id/fahrenheit" android:layout_width="wrap_content" android:layout_height="wrap_content" android:text="EditText" />
</LinearLayout>

Doing so we get our tests to compile. Running them we get the following results:

  • testPreconditions succeeded
  • testHasInputFields succeeded
  • Everything is green now

This clearly means that we are on track with applying TDD.

You may also have noticed that we added some decorative and non-functional items to our user interface that we are not testing, mainly to keep our example as simple as possible. In a real-case scenario you may want to add tests for these elements too.

Translating requirements to tests

Tests have a double role. They verify the correctness of our code, but sometimes, and more prominently in TDD, they help us understand the design and digest what we are implementing. To be able to create the tests, we need to understand the problem we are dealing with, or at least have a rough understanding of it that allows us to handle it.

Many times, the requirements behind the user interface are not clearly expressed and you should be able to understand them from the schematic UI design. If we pretend that this is the case, then we can grasp it by writing our tests first.

Empty fields

From one of our requirements, we get: Entry fields should start empty.

To express this in a test we can write:

public final void testFieldsShouldStartEmpty() {
    assertEquals("", mCelsius.getText().toString());
    assertEquals("", mFahrenheit.getText().toString());
}

Here, we simply compare the initial contents of the fields against the empty string.

Not very surprisingly, we find that the test fails on execution. We forgot to clear the initial contents of the fields, so they are not empty. Even though we haven't assigned any value to the android:text property of these fields, the ADT plugin layout editor adds some default values. Thus, removing the default values from the android:text property of both EditText fields will force them to start empty. These values may have been added by the ADT plugin itself, or maybe by you when entering the properties.

On running the test again, we find that it passes. We successfully converted one requirement to a test and validated it by obtaining the test results.

View properties

Identically, we can verify other properties of the Views composing our layout. Among other things we can verify:

  • Fields appear on the screen as expected
  • Font sizes
  • Margins
  • Screen alignment

Let's start verifying that the fields are on the screen:

public final void testFieldsOnScreen() {
    final Window window = mActivity.getWindow();
    final View origin = window.getDecorView();
    assertOnScreen(origin, mCelsius);
    assertOnScreen(origin, mFahrenheit);
}

As explained before, we use an assert from ViewAsserts here: assertOnScreen.

The assertOnScreen method needs an origin to start looking for the other Views. In this case, because we want to start from the top most level, we use getDecorView(), which retrieves the top-level window decor view containing the standard window frame and decorations, and the client's content inside.

By running this test, we can ensure that the entry fields are on the screen as the UI design dictates. In some way we already knew that some Views with these specific IDs existed. That is, we made the fixture compile by adding the Views to the main layout, but we were not sure they were appearing on the screen at all. So, nothing else is needed but the sole presence of this test to ensure that the condition is not changed in the future. If we remove one of the fields for some reason, this test will tell us that it is missing and not complying with the UI design.

Following with our list of requirements, we should test that the Views are aligned in the layout as we expect:

public final void testAlignment() {
    assertLeftAligned(mCelsiusLabel, mCelsius);
    assertLeftAligned(mFahrenheitLabel, mFahrenheit);
    assertLeftAligned(mCelsius, mFahrenheit);
    assertRightAligned(mCelsius, mFahrenheit);
}

We continue using asserts from ViewAsserts, in this case assertLeftAligned and assertRightAligned. These methods verify the alignment of the specified Views.

The LinearLayout we are using by default arranges the fields in the way we are expecting them. Again, while we don't need to add anything to the layout, to satisfy the test, this will act as a guard condition.

Once we've verified that they are correctly aligned, we should verify that they cover the whole screen width, as specified by the schematic drawing. In this example, it's sufficient to verify that the LayoutParams have the correct values:

public final void testCelsiusInputFieldCoverEntireScreen() {
    final int expected = LayoutParams.MATCH_PARENT;
    final LayoutParams lp = mCelsius.getLayoutParams();
    assertEquals("mCelsius layout width is not MATCH_PARENT",
            expected, lp.width);
}

public final void testFahrenheitInputFieldCoverEntireScreen() {
    final int expected = LayoutParams.MATCH_PARENT;
    final LayoutParams lp = mFahrenheit.getLayoutParams();
    assertEquals("mFahrenheit layout width is not MATCH_PARENT",
            expected, lp.width);
}

We used a custom message to easily identify the problem in case the test fails.

By running this test, we obtain the following message indicating that the test failed:

junit.framework.AssertionFailedError: mCelsius layout width is not
MATCH_PARENT expected:<-1> but was:<-2>

This leads us to the layout definition. We must change layout_width to match_parent for the Celsius and Fahrenheit fields:

<EditText android:layout_height="wrap_content"
    android:id="@+id/celsius" android:layout_width="match_parent"
    ... />

The same goes for the Fahrenheit field. After the change is done, we repeat the cycle, and by running the test again we can verify that it is now successful.

Our methodology is starting to take shape. We create a test to verify a condition described in the requirements. If it's not met, we change the cause of the problem, and by running the tests again we verify that the latest change solves the problem, and, what is perhaps more important, that the change doesn't break the existing code.

Next, let's verify that font sizes are as defined in our requirements:

public final void testFontSizes() {
    final float expected = 24.0f;
    assertEquals(expected, mCelsiusLabel.getTextSize());
    assertEquals(expected, mFahrenheitLabel.getTextSize());
}

Retrieving the font size used by the field is enough in this case.

The default font size is not 24px, so we need to add this to our layout. It's good practice to add the corresponding dimension to a resource file and then use it where it's needed in the layout. So, let's add label_text_size to res/values/dimens.xml with a value of 24px, then reference it in the Text size property of both labels, celsius_label and fahrenheit_label.
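As a sketch of what this looks like (the exact file contents are our reconstruction, not the book's listing), the dimension resource and its use in the layout could be:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/values/dimens.xml: the size expected by testFontSizes() -->
<resources>
    <dimen name="label_text_size">24px</dimen>
</resources>
```

In main.xml, each label would then reference it as android:textSize="@dimen/label_text_size".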

Now the test passes.

Finally, let's verify that margins are interpreted as described in the user interface design:

public final void testMargins() {
    LinearLayout.LayoutParams lp;
    final int expected = 6;
    lp = (LinearLayout.LayoutParams) mCelsius.getLayoutParams();
    assertEquals(expected, lp.leftMargin);
    assertEquals(expected, lp.rightMargin);
    lp = (LinearLayout.LayoutParams) mFahrenheit.getLayoutParams();
    assertEquals(expected, lp.leftMargin);
    assertEquals(expected, lp.rightMargin);
}

This is a similar case to the previous one. We need to add this to our layout. Let's add the margin dimension to the resource file and then use it where it's needed in the layout. Set the margin dimension in res/values/dimens.xml to a value of 6px, then reference it in the Margin property of both fields, celsius and fahrenheit, and in the Left margin of the labels.
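Again as a sketch (the dimension name margin and the attribute placement are our assumption), the resource and one of the fields referencing it could look like:

```xml
<!-- res/values/dimens.xml: margin expected by testMargins() -->
<resources>
    <dimen name="margin">6px</dimen>
</resources>

<!-- in main.xml, applied to both entry fields; the labels get the left margin only -->
<EditText android:id="@+id/celsius"
    android:layout_marginLeft="@dimen/margin"
    android:layout_marginRight="@dimen/margin"
    ... />
```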

One thing that remains is the verification of the justification of the entered values. We will validate the input shortly, allowing only the permitted values, but for now let's just pay attention to the justification. The intention is to have values that are smaller than the whole field justified to the right and vertically centered:

public final void testJustification() {
    final int expected = Gravity.RIGHT | Gravity.CENTER_VERTICAL;
    int actual = mCelsius.getGravity();
    assertEquals(String.format("Expected 0x%02x but was 0x%02x",
            expected, actual), expected, actual);
    actual = mFahrenheit.getGravity();
    assertEquals(String.format("Expected 0x%02x but was 0x%02x",
            expected, actual), expected, actual);
}

Here we verify the gravity values as usual. However, we use a custom message to help us identify values that could be wrong. As the Gravity class defines several constants whose values are more easily identified in hexadecimal, we convert the values to this base in the message.

If this test fails due to the default gravity used for the fields, then all that is left is to change it. Go to the layout definition and alter these gravity values so that the test succeeds.

This is precisely what we need to add to both entry fields:

android:gravity="right|center_vertical"
Screen layout

We now want to verify that the requirement specifying that enough screen space should be reserved to display the keyboard is actually fulfilled.

We can write a test like this:

public final void testVirtualKeyboardSpaceReserved() {
    final int expected = 280;
    final int actual = mFahrenheit.getBottom();
    assertTrue(actual <= expected);
}

This verifies that the actual position of the last field in the screen, which is mFahrenheit, is not lower than a suggested value.

We can run the tests again verifying that everything is green again.


Summary

We presented Test Driven Development, introducing its concepts and then applying them step by step to a potential real-life problem.

We started with a concise list of requirements describing the Temperature Converter application.

In the next article we will take a look at adding some basic functionality to the user interface.
