Testing with Xtext and Xtend

by Lorenzo Bettini | August 2013 | Open Source

In this article by Lorenzo Bettini, author of the book Implementing Domain-Specific Languages with Xtext and Xtend, you will learn how to test a DSL implementation by using the JUnit framework and the additional utility classes provided by Xtext. This way, your DSL implementation will have a suite of tests that can be run automatically. We will use the Entities DSL developed previously to show the typical techniques for testing both the runtime and the UI features of a DSL implemented in Xtext.


Introduction to testing

Writing automated tests is a fundamental methodology when developing software. It will help you write quality software where most aspects (possibly all aspects) are verified in an automatic and continuous way. Although successful tests do not guarantee that the software is bug-free, automated tests are a necessary condition for professional programming (see Beck 2002, Martin 2002, 2008, 2011 for some insightful reading about this subject).

Tests will also document your code, whether it is a framework, a library, or an application; tests are a form of documentation that does not risk becoming stale with respect to the implementation itself. Javadoc comments will likely not be kept in sync with the code they document, and manuals tend to become obsolete if not updated consistently, while tests will fail if they are not up-to-date.

The Test Driven Development (TDD) methodology fosters writing tests even before writing the production code. When developing a DSL, one can relax this methodology by not necessarily writing the tests first. However, one should write tests as soon as a new functionality is added to the DSL implementation. This must be taken into consideration right from the beginning; thus, you should not try to write the complete grammar of a DSL, but proceed gradually: write a few rules to parse a minimal program, and immediately write tests for parsing some test input programs. Only when these tests pass should you go on to implement other parts of the grammar.

Moreover, if some validation rules can already be implemented with the current version of the DSL, you should write tests for the current validator checks as well.

Ideally, one should not have to run Eclipse to manually check whether the current implementation of the DSL works as expected. Using tests will make the development much faster.

The number of tests will grow as the implementation grows, and tests should be executed each time you add a new feature or modify an existing one. You will see that since tests run automatically, executing them over and over again requires no additional effort besides triggering their execution (imagine instead having to manually check that what you added or modified did not break something).

This also means that you will not be scared to touch something in your implementation; after you make some changes, just run the whole test suite and check whether you broke something. If some tests fail, you will just need to check whether the failure is actually expected (and in that case fix the test) or whether your modifications have to be fixed.

It is worth noting that using a version control system (such as Git) is still essential: tests help you find errors while experimenting with your code, but only version control lets you easily get back to a known state.

You will not even be scared of porting your implementation to a new version of the frameworks you use. For example, when a new version of Xtext is released, it is likely that some API has changed, and your DSL implementation might not build anymore with the new version. Certainly, running the MWE2 workflow again will be required. But once your sources compile again, your test suite will tell you whether the behavior of your DSL is still the same. In particular, if some of the tests fail, you will get an immediate idea of which parts need to be changed to conform to the new version of Xtext.

Moreover, if your implementation relies on a solid test suite, it will be easier for contributors to provide patches and enhancements for your DSL; they can run the test suite themselves, or they can add further tests for a specific bugfix or a new feature. It will also be easy for the main developers to decide whether to accept contributions by running the tests.

Last but not least, you will discover that writing tests right from the beginning forces you to write modular code (otherwise you will not be able to test it easily), and it makes programming much more fun.

Xtext and Xtend themselves are developed with a test-driven approach.

JUnit 4

JUnit is the most popular unit test framework for Java, and it is shipped with the Eclipse JDT. In particular, the examples in this article are based on JUnit version 4.

To implement JUnit tests, you just need to write a class with methods annotated with @org.junit.Test. We will call such methods simply test methods. Such Java (or Xtend) classes can then be executed in Eclipse using the "JUnit test" launch configuration; all methods annotated with @Test will then be executed by JUnit. In test methods you can use the assert methods provided by JUnit to implement a test. For example, assertEquals(expected, actual) checks whether the two arguments are equal; assertTrue(expression) checks whether the passed expression evaluates to true. If an assertion fails, JUnit will record such a failure; in particular, in Eclipse, the JUnit view will provide you with a report about the tests that failed. Ideally, no test should fail (and you should see the green bar in the JUnit view).

JUnit may execute test methods in any order; thus, you should never write a test method which depends on another one: all test methods should be executable independently of each other.

If you annotate a method with @Before, that method will be executed before each test method in that class; thus, it can be used to prepare a common setup for all the test methods in that class. Similarly, a method annotated with @After will be executed after each test method (even if it fails); thus, it can be used to clean up the environment. A static method annotated with @BeforeClass will be executed only once, before the start of all test methods (@AfterClass provides the complementary functionality).
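These lifecycle annotations can be sketched in a minimal test class as follows (an illustrative Xtend sketch; the class and method names are hypothetical):

class MyLifecycleTest {

    @BeforeClass
    static def void setUpOnce() {
        // executed only once, before all the test methods
    }

    @Before
    def void setUp() {
        // executed before each test method
    }

    @Test
    def void testSomething() {
        Assert::assertTrue(true)
    }

    @After
    def void tearDown() {
        // executed after each test method, even if it fails
    }
}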

The ISetup interface

Running tests means we somehow need to bootstrap the environment to make it support EMF and Xtext in addition to the implementation of our DSL. This is done with a suitable implementation of ISetup. We need to configure things differently depending on how we want to run tests; with or without Eclipse and with or without Eclipse UI being present. The way to set up the environment is quite different when Eclipse is present, since many services are shared and already part of the Eclipse environment. When setting up the environment for non-Eclipse use (also referred to as standalone) there are a few things that must be configured, such as creating a Guice injector and registering information required by EMF. The method createInjectorAndDoEMFRegistration in the ISetup interface is there to do exactly this.

Besides the creation of an Injector, this method also performs all the initialization of EMF global registries so that after the invocation of that method, the EMF API to load and store models of your language can be fully used, even without a running Eclipse. Xtext generates an implementation of this interface, named after your DSL, which can be found in the runtime plugin project. For our Entities DSL it is called EntitiesStandaloneSetup.

The name "standalone" expresses the fact that this class has to be used when running outside Eclipse. Thus, the preceding method must never be called when running inside Eclipse (otherwise the EMF registries will become inconsistent).

In a plain Java application the typical steps to set up the DSL (for example, our Entities DSL) can be sketched as follows:

Injector injector = new EntitiesStandaloneSetup().
    createInjectorAndDoEMFRegistration();
XtextResourceSet resourceSet =
    injector.getInstance(XtextResourceSet.class);
resourceSet.addLoadOption(
    XtextResource.OPTION_RESOLVE_ALL, Boolean.TRUE);
Resource resource = resourceSet.getResource(
    URI.createURI("/path/to/my.entities"), true);
Model model = (Model) resource.getContents().get(0);

This standalone setup class is also especially useful for JUnit tests, which can then be run without an Eclipse instance. This will speed up the execution of the tests. Of course, in such tests you will not be able to test UI features.

As we will see in this article, Xtext provides many utility classes for testing which do not require us to set up the runtime environment explicitly. However, it is important to know about the existence of the setup class in case you either need to tweak the generated standalone compiler or you need to set up the environment in a specific way for unit tests.

Implementing tests for your DSL

Xtext strongly fosters unit testing, and this is reflected by the fact that, by default, the MWE2 workflow generates a specific plug-in project for testing your DSL. In fact, tests should usually reside in a separate project, since they should not be deployed as part of your DSL implementation. The name of this additional project ends with the .tests suffix; thus, for our Entities DSL, it is org.example.entities.tests. The tests plug-in project has the needed dependencies on the required Xtext utility bundles for testing.

We will use Xtend to write JUnit tests.

In the src-gen directory of the tests project, you will find the injector providers for both headless and UI tests. You can use these providers to easily write JUnit test classes without having to worry about the setup of the injection mechanism. The JUnit tests that use an injector provider will typically have the following shape (using the Entities DSL as an example):

@RunWith(typeof(XtextRunner))
@InjectWith(typeof(EntitiesInjectorProvider))
class MyTest {
    @Inject MyClass myClass
    ...
As hinted in the preceding code, in this class you can rely on injection; we used @InjectWith and declared that EntitiesInjectorProvider has to be used to create the injector. EntitiesInjectorProvider will transparently provide the correct configuration for a standalone environment. As we will see later in this article, when we want to test UI features, we will use EntitiesUiInjectorProvider (note the "Ui" in the name).

Testing the parser

The first tests you might want to write are the ones which concern parsing.

This reflects the fact that the grammar is the first thing you must write when implementing a DSL. You should not try to write the complete grammar before starting to test it: you should write only a few rules and soon write tests to check whether those rules actually parse an input test program as you expect.

The nice thing is that you do not have to store the test input in a file (though you could do that); the input to pass to the parser can be a string, and since we use Xtend, we can use multi-line strings.

The Xtext test framework provides the class ParseHelper to easily parse a string. The injection mechanism will automatically tell this class to parse the input string with the parser of your DSL. To parse a string, we inject an instance of ParseHelper<T>, where T is the type of the root class in our DSL's model – in our Entities example, this class is called Model. The method ParseHelper.parse will return an instance of T after parsing the input string given to it.

By injecting the ParseHelper class as an extension, we can directly use its methods on the strings we want to parse. Thus, we can write:

@RunWith(typeof(XtextRunner))
@InjectWith(typeof(EntitiesInjectorProvider))
class EntitiesParserTest {

    @Inject extension ParseHelper<Model>

    @Test
    def void testParsing() {
        val model = '''
            entity MyEntity {
                MyEntity attribute;
            }
        '''.parse

        val entity = model.entities.get(0)
        Assert::assertEquals("MyEntity", entity.name)

        val attribute = entity.attributes.get(0)
        Assert::assertEquals("attribute", attribute.name)
        Assert::assertEquals("MyEntity",
            (attribute.type.elementType as EntityType).entity.name)
    }
    ...

In this test, we parse the input and test that the expected structure was constructed as a result of parsing. These tests do not add much value in the Entities DSL, but in a more complex DSL you do want to test that the structure of the parsed EMF model is as you expect.

You can now run the test: right-click on the Xtend file and select Run As | JUnit Test, as shown in the following screenshot. The test should pass and you should see the green bar in the JUnit view.

Note that the parse method returns an EMF model even if the input string contains syntax errors (the parser tries to parse as much as it can); thus, if you want to make sure that the input string is parsed without any syntax error, you have to check that explicitly. To do that, you can use another utility class, ValidationTestHelper. This class provides many assert methods that take an EObject argument. You can inject it as an extension field and simply call assertNoErrors on the parsed EMF object. Alternatively, if you do not need the EMF object and just need to check that there are no parsing errors, you can simply call it on the result of parse, for example:

class EntitiesParserTest {

    @Inject extension ParseHelper<Model>
    @Inject extension ValidationTestHelper
    ...

    @Test
    def void testCorrectParsing() {
        '''
            entity MyEntity {
                MyEntity attribute
            }
        '''.parse.assertNoErrors
    }

If you try to run the tests again, you will get a failure for this new test, as shown in the following screenshot:

The reported error should be clear enough: we forgot to add the terminating ";" in our input program, thus we can fix it and run the test again; this time the green bar should be back.

You can now write other @Test methods for testing the various features of the DSL (see the sources of the examples). Depending on the complexity of your DSL you may have to write many of them.

Tests should test one specific thing at a time; lumping things together (to reduce the overhead of writing many test methods) usually makes the tests harder to understand and maintain later.
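For instance, a further focused test method for the Entities DSL might check only the parsing of the extends clause (a sketch, assuming the Entity rule stores the superclass reference in a feature called superType):

@Test
def void testEntityExtends() {
    val model = '''
        entity A {}
        entity B extends A {}
    '''.parse

    // the feature name superType is an assumption about the grammar
    val b = model.entities.get(1)
    Assert::assertEquals("B", b.name)
    Assert::assertEquals("A", b.superType.name)
}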

Remember that you should follow this methodology while implementing your DSL, not after having implemented all of it. If you follow this strictly, you will not have to launch Eclipse to manually check that you implemented a feature correctly, and you will note that this methodology will let you program really fast.

Ideally, you should start with a grammar containing a single rule, especially if the grammar contains non-standard terminals. The very first task is to write a grammar that just parses all the terminals, and a test ensuring there are no overlapping terminals, before proceeding; this is not needed if no terminals are added to the standard ones. After that, add as few rules as possible in each round of development/testing until the grammar is complete.

Testing the validator

Earlier we used the ValidationTestHelper class to test that it was possible to parse without errors. Of course, we also need to test that errors and warnings are detected. In particular, we should test any error situation handled by our own validator. The ValidationTestHelper class contains utility methods (besides assertNoErrors) that allow us to test whether the expected errors are correctly issued.

For instance, for our Entities DSL, we wrote a custom validator method that checks that the entity hierarchy is acyclic. Thus, we should write a test that, given an input program with a cycle in the hierarchy, checks that such an error is indeed raised during validation.

Although not strictly required, it is better to separate JUnit test classes according to the tested features; thus, we write another JUnit class, EntitiesValidatorTest, which contains the tests related to validation. The start of this new JUnit test class should look familiar:

@RunWith(typeof(XtextRunner))
@InjectWith(typeof(EntitiesInjectorProvider))
class EntitiesValidatorTest {

    @Inject extension ParseHelper<Model>
    @Inject extension ValidationTestHelper
    ...

We are now going to use the assertError method from ValidationTestHelper, which, besides the EMF model element to validate, requires the following arguments:

  • EClass of the object which contains the error (which is usually retrieved through the EMF EPackage class generated when running the MWE2 workflow)
  • The expected Issue Code
  • An optional string describing the expected error message

Thus, we parse input containing an entity extending itself and we pass the arguments to assertError according to the error generated by checkNoCycleInEntityHierarchy in EntitiesValidator:

@Test
def void testEntityExtendsItself() {
    '''
        entity MyEntity extends MyEntity {
        }
    '''.parse.assertError(
        EntitiesPackage::eINSTANCE.entity,
        EntitiesValidator::HIERARCHY_CYCLE,
        "cycle in hierarchy of entity 'MyEntity'"
    )
}

Note that the EObject argument is the one returned by the parse method (we use assertError as an extension method). Since the error concerns an Entity object, we specify the corresponding EClass (retrieved using EntitiesPackage), the expected Issue Code, and finally, the expected error message. This test should pass.

We can now write another test which tests the same validation error on a more complex input with a cycle in the hierarchy involving more than one entity; in this test we make sure that our validator issues an error for each of the entities involved in the hierarchy cycle:

@Test
def void testCycleInEntityHierarchy() {
    val model = '''
        entity A extends B {}
        entity B extends C {}
        entity C extends A {}
    '''.parse

    model.assertError(EntitiesPackage::eINSTANCE.entity,
        EntitiesValidator::HIERARCHY_CYCLE,
        "cycle in hierarchy of entity 'A'"
    )
    model.assertError(EntitiesPackage::eINSTANCE.entity,
        EntitiesValidator::HIERARCHY_CYCLE,
        "cycle in hierarchy of entity 'B'"
    )
    model.assertError(EntitiesPackage::eINSTANCE.entity,
        EntitiesValidator::HIERARCHY_CYCLE,
        "cycle in hierarchy of entity 'C'"
    )
}

Note that this time we must store the parsed EMF model into a variable since we will call assertError many times.

We can also test that NamesAreUniqueValidator detects elements with the same name:

@Test
def void testDuplicateEntities() {
    val model = '''
        entity MyEntity {}

        entity MyEntity {}
    '''.parse

    model.assertError(EntitiesPackage::eINSTANCE.entity,
        null,
        "Duplicate Entity 'MyEntity'"
    )
}

In this case, we pass null for the issue argument, since no Issue Code is reported by NamesAreUniqueValidator.

Similarly, we can write a test where the input has two attributes with the same name:

@Test
def void testDuplicateAttributes() {
    val model = '''
        entity MyEntity {
            MyEntity attribute;
            MyEntity attribute;
        }
    '''.parse

    model.assertError(EntitiesPackage::eINSTANCE.attribute,
        null,
        "Duplicate Attribute 'attribute'"
    )
}

Note that in this test we pass the EClass corresponding to Attribute, since duplicate attributes are involved in the expected error.

Do not worry if it seems tricky to get the arguments for assertError right the first time; writing a test that fails on its first execution is expected in Test Driven Development. The error of the failing test should put you on the right track for specifying the arguments correctly. However, when inspecting the error of a failing test, you must first make sure that the actual output is what you expected; otherwise something is wrong either with your test or with the implementation of the component you are testing.

Testing the formatter

As we said previously, the formatter is also used in a non-UI environment (indeed, we implemented it in the runtime plug-in project); thus, we can test the formatter for our DSL with plain JUnit tests. At the moment, there is no helper class in the Xtext framework for testing the formatter, so we need some additional work to set up the tests for the formatter. This example will also provide some more details on Xtext and EMF, and it will introduce unit test methodologies that are useful in many scenarios where you need to check whether a string output is what you expect.

First of all, we create another JUnit test class for testing the formatter; this time we do not need the helper for the validator. We will inject INodeModelFormatter as an extension field, since this is the class used internally by Xtext to perform formatting.

One of the main principles of unit testing (which is also its main strength) is that you should test a single functionality in isolation. Thus, to test the formatter, we must not run a UI test that opens an Xtext editor on an input file and invokes the menu item which performs the formatting; we just need to test the class to which the formatting is delegated, and we do not need a running Eclipse for that.

import static extension org.junit.Assert.*

@RunWith(typeof(XtextRunner))
@InjectWith(typeof(EntitiesInjectorProvider))
class EntitiesFormatterTest {

    @Inject extension ParseHelper<Model>
    @Inject extension INodeModelFormatter

Note that we import all the static methods of the JUnit Assert class as extension methods.

Then, we write the code that actually performs the formatting, given an input string. Since we will write several tests for formatting, we isolate such code in a reusable method. This method is not annotated with @Test; thus, it will not be automatically executed by JUnit as a test method. This is the Xtend code that returns the formatted version of the input string:

(input.parse.eResource as XtextResource).parseResult
    .rootNode.format(0, input.length).formattedText

The method ParseHelper.parse returns the EMF model object, and each EObject has a reference to the containing EMF resource; we know that in our case this is actually an XtextResource (a specialized version of an EMF resource). From the resource we retrieve the result of parsing, an IParseResult object, which contains the node model; recall that the node model carries the syntactical information, that is, offsets and whitespace of the textual input. The root of the node model, an ICompositeNode, can be passed to the formatter to get the formatted version (we can even specify to format only a part of the input program).

Now we can write a reusable method that takes an input char sequence and an expected char sequence and tests that the formatted version of the input program is equal to what we expect:

def void assertFormattedAs(CharSequence input, CharSequence expected) {
    expected.toString.assertEquals(
        (input.parse.eResource as XtextResource).parseResult
            .rootNode.format(0, input.length).formattedText
    )
}

The reason why we convert the expected char sequence into a string will be clear in a minute. Note the use of Assert.assertEquals as an extension method.

We can now write our first formatting test using our extension method assertFormattedAs:

@Test
def void testEntities() {
    '''
    entity E1 { } entity E2 {}
    '''.assertFormattedAs(
        '''...'''
    )
}

Why did we specify "…" as the expected formatted output? Why did we not try to specify what we really expect as the formatted output? Well, we could have written the expected output, and probably we would have gotten it right on the first try, but why not simply make the test fail and see the actual output? We can then copy that into our test once we are convinced that it is correct. So let's run the test; when it fails, the JUnit view tells us what the actual result is, as shown in the following screenshot:

If you now double-click on the line showing the comparison failure in the JUnit view, you will get a dialog showing a line-by-line comparison, as shown in the following screenshot:

You can verify that the actual output is correct, copy that, and paste it into your test as the expected output. The test will now succeed:

@Test
def void testEntities() {
    '''
    entity E1 { } entity E2 {}
    '''.assertFormattedAs(
'''
entity E1 {
}
entity E2 {
}'''
    )
}

We did not indent the expected output in the multi-line string, since it is easy to paste it like that from the JUnit dialog.

Using this technique, you can easily write JUnit tests that deal with comparisons. However, the Result Comparison dialog appears only if you pass String objects to assertEquals; that is why we converted the char sequence into a string in the implementation of assertFormattedAs.

We now add a test for testing the formatting of attributes; the final result will be:

@Test
def void testAttributes() {
    '''
    entity E1 { int i ; string s; boolean b ;}
    '''.assertFormattedAs(
'''
entity E1 {
	int i;
	string s;
	boolean b;
}'''
    )
}

Summary

In this article we introduced unit testing for languages implemented with Xtext. Being able to test most of the DSL aspects without having to start an Eclipse environment really speeds up development. Test Driven Development is an important programming methodology that helps you make your implementations more modular, more reliable, and resilient to changes in the libraries used by your code.

About the Author:


Lorenzo Bettini

Lorenzo Bettini is an assistant professor (Researcher) in computer science at Dipartimento di Informatica, Università di Torino, Italy. Previously, he was a Postdoc and a contractual researcher at Dipartimento di Sistemi e Informatica, Università di Firenze, Italy.

He has a Master's degree in computer science and a PhD in theoretical computer science.

His research interests cover design, theory, and the implementation of programming languages (in particular, object-oriented languages and network-aware languages).

He has been using Xtext since version 0.7. He has used Xtext and Xtend for implementing many Domain Specific Languages and Java-like programming languages.

He is also the author of about 60 papers published in international conferences and international journals.

You can contact him at http://www.lorenzobettini.it.

