A couple of years ago, I finally got to write this book on application test automation – one I had wanted to write for a long time. Automated testing doesn't appear to be a love-at-first-sight topic for many. Like testing in general, in many implementations it tends to be subordinate to requirements specification and application coding, be it a project or a product. And who really loves testing? It is not typically something the average developer gets enthusiastic about. Even functional consultants do not easily step up when I raise this question during my workshops on automated testing.
So, when I started writing this book, I inevitably posed the question to myself: do I love testing? And I answered it with a yes. A big YES. This subsequently led to additional questions, such as: What makes me love testing? Did I always love testing? Why do I love it while the rest of the world seems not to? Questions with answers that made me go around the world evangelizing test automation, stubbornly sharing my findings, and pushing Microsoft to make their tests better and reusable. And all because I reckon that it is fun – BIG fun!
Having been an application tester in the former Dynamics NAV Global Development Localization (GDL) team at Microsoft, I surely got exposed to the testing virus. You could say that I had to learn to do the testing job, as I was paid for it. But it also suited me well, as I apparently have that specific bit of DNA that makes testers what testers are: taking pride in breaking the thing while, hopefully, loving to prove its robustness at the same time. And, not least, daring to break it, running the risk that your developers will no longer like you.
One afternoon at Microsoft, my developer team member walked into our office and stopped next to my desk. Him being very tall and me sitting, I had to turn my head up to look at him.
"What's up?" I asked.
"Don't you like me anymore?" he responded.
Me: "What?"
Him: "Don't you like me anymore?"
Me: "Nope, still like you. Why?"
Him: "You rejected my code; don't you like me anymore?"
Me: "Dude, I still like you, but concerning your code, my tests show it somehow useless."
Testing is not rocket science, nor is automated testing. It's just another learnable skill. From a developer's perspective, however, it requires a change of mindset to write code with a totally different purpose than you are used to. And we all know that change is often not the easiest thing to achieve. Because of this, it's not unusual for attendees at my workshops to get to a certain level of frustration.
Application testing is a mindset, and it needs a big dose of discipline too – the discipline to do what needs to be done: to verify that the feature was built right, that the feature under test meets the requirements – and the discipline, when bugs are reported and fixed, to execute the whole test run again and again after every bug fix, to ensure that the verification is complete.
I tend to see myself as a disciplined professional. I have been quite a disciplined tester, with a high rate of bug reporting. But did I always love testing? You know, in those days, all our tests were executed manually, and with each bug I found, my discipline was challenged in some way. Imagine my mind running when executing the fifth test run after another bug fix. It's 4:00 P.M. and I am almost done, and the feature under test has to be delivered today. At breakfast, I promised my wife that I would be home on time, for whatever reason. Let's pick a random one: our wedding anniversary. So, me being a disciplined tester, my promise to be home on time, 4:00 P.M., and … I … hit … another … bug. Knowing that fixing it and rerunning the tests would take at least a couple of hours; how do you think my mind is running? Right: binary.
I had two options:
Had automated tests been in place, the choice would have been quite simple: the first option, resulting in no hassle at home and no hassle at work. Welcome to my world. Welcome to test automation!
In this chapter, we will discuss the following topics:
Note
If you prefer to read the what first, you might want to jump ahead to What is automated testing?
Plainly said: it all boils down to saving you a lot of hassle. There are no emotions or time-intensive execution steps that will keep you from (re)running tests. It's just a matter of pushing the button and getting the tests carried out. This is easily reproducible: each time you push the button, the test will be executed in exactly the same way. It is fast, as automated tests run "light-years" faster than any of us ever could. And it is objective: a test fails or succeeds, and this will be part of the overall reporting – no human emotions that could hide it away.
If it were as straightforward as that, then why haven't we been doing this in the Dynamics 365 Business Central world all these years? You can probably come up with a number of arguments yourself, of which the most prominent might be: we do not have time for that. Or maybe: who's going to pay for that?
Before elaborating on any arguments, pros or cons, let me create a more complete why not? list. Let's call it the whys of non-automated testing:
Dynamics 365 Business Central is not as simple as it used to be way back when it was called Navigator, Navision Financials, or Microsoft Business Solutions – Navision. And the world around us isn't as simple either. Our development practices are becoming more formal, and, with this, the call for test automation is pressing on us for almost the same reasons as the whys of non-automated testing:
Let's check each of these aspects separately.
Regarding costs, I tend to say that a Dynamics 365 Business Central project goes 25% over budget on average, mainly due to bug fixing after go-live. Quite a number of attendees of my workshops, however, have corrected me, saying that this is on the low side. Whatever the percentage, the math is quite simple: if you're spending 25% extra at the end of the line, why not push it upstream and spend it during the development phase on automated testing, meanwhile building up reusable collateral? Even more, this acts as a powerful multiplier, as any defect solved earlier in the project chain – upstream, as we also call it – is a factor cheaper than one solved later in the project, that is, downstream.
If my memory is not failing me, this factor could be as large as 1,000. During my time at Microsoft in the 2000s, research was performed on the cost of catching a bug in the different stages of developing a major release of a product. Catching a bug after the release was found to be approximately 1,000 times costlier than catching it at the requirements specification stage.
Translating this to the world of an independent software vendor (ISV), this factor might roughly be 10 times lower, meaning that the cost of catching a bug all the way downstream would be 100 times higher than all the way upstream. In the case of a value-added reseller (VAR) doing one-off projects, this factor is probably 100 times lower. So, whatever the exact factors are, any spending upstream is more cost-effective than downstream, be it more formalized testing, better app coding, code inspection, or specifying more detailed requirements.
Note
A while ago, I wrote a joint blog post that also discussed this. You might want to read it:
https://www.fluxxus.nl/index.php/BC/from-a-testing-mindset-to-a-quality-assurance-based-mindset.
In all honesty, this is a no-brainer as this is the topic of this book. But it is worthwhile realizing that the testability framework inside the platform has been there ever since version 2009 SP1, released in the summer of 2009. So, for over 10 years, the platform has enabled us to build automated tests. Does it sound strange if I say that we have been sleeping for a long time? Sticking to old habits? At least most of us.
I agree that customers might know their features best, and as such, they are the presumable testers. But can you rely 100% on their testing not being squeezed between the deadlines of the implementation and, on top of that, the deadlines of their daily work? And in what way does their testing contribute to a more effective test effort in the future? How structured and reproducible will it be?
Posing these questions answers them already. It isn't a great idea, in general, to rely on customer testing if you want to improve your development practices. Having said that, this doesn't mean that customers should not be included; by all means, incorporate them into setting up your automated tests. We will elaborate on that more later.
At this very moment, all implementation partners in the Dynamics world are having a hard time finding people to add to their teams in order to get the work done. Note that I deliberately didn't use the adjective right in that sentence. We all are facing this challenge. And, even if human resources were abundant, practices show that, with a growing business in either volume or size, the number of resources used does not grow proportionally.
Consequently, we must all invest in changing our daily practices, which very often leads to automation – using, for example, PowerShell to automate administrative tasks and the RapidStart methodology for configuring new companies. Likewise, we should be writing automated tests to make our test efforts easier and faster. Indeed, it needs a certain investment to get up and running, but it will save you time in the end.
Similar to getting the job done with proportionally fewer resources, test automation will eventually be of help in freeing up time for everyday business. This comes with an initial price but will pay off in due time.
Note
A logical question posed when I touch on the topic of spending time when introducing test automation concerns the ratio of time spent on application and test coding. Typically, in the Microsoft Dynamics 365 Business Central team, this is a 1:1 ratio. Meaning that, for each hour of application coding, there is 1 hour of test coding.
Traditional Dynamics 365 Business Central implementation partners typically have their hands full of customers with a one-off solution. And, due to this, they have dedicated teams or consultants taking care of these installations; testing is handled in close cooperation with the end user, with every test run putting a significant load on those involved. Imagine what it would mean to have automated test collateral in place – how you would be able to keep on servicing those one-off projects as your business expands.
On any major development platform, such as Visual Studio, it has long been common practice to deliver applications with test automation in place. Note that more and more customers are aware of these practices, and more and more of them will ask you why you are not providing automated tests for their solution.
Each existing project, having a lot of functionality and no automated tests, is a threshold to take. In a lot of cases, a major part of the features used is standard Dynamics 365 Business Central functionality, and, for these, Microsoft has made their own tests available since version 9, that is, Dynamics NAV 2016. Altogether, over 33,000 tests have been made available for the latest version of Business Central. This is a humongous number, which you might take advantage of to make a relatively quick start. We will discuss these tests and how you could get them running on any solution later.
Still not convinced why you would/should/could start using test automation? Do you need more arguments to sell it inside your company and to your customers?
Here are some more arguments:
Well, almost nobody. And, surely, when testing means rerunning and rerunning today, tomorrow, and next year, it tends to become a nuisance, which deteriorates the testing discipline. Automate tasks that bore people and free up time for more relevant testing where manual work makes a difference.
Having automated test collateral enables you to gain quicker insight than ever before into the state of the code. And, while building up this collection, the recurrence of old bugs and the introduction of new ones will be much lower than ever before. This all leads to reduced risks and greater customer satisfaction – and your management will love that.
Learning this new skill of automating your tests will take a while, no doubt about that. But, once it is mastered, conceiving and creating tests will often be much quicker than doing the thing manually, as tests often are variations of each other. Copying and pasting with code is ... well ... can you do such a thing with manual testing? Not to mention the choice between rerunning these automated tests and rerunning the manual ones.
Test automation is not just automating your test practice and speeding up some parts of your daily work, it's opening up new possibilities, as obvious as they can be. Just to mention a few:
Note
You might want to read my blog post for more examples of how your practice can be leveraged: https://www.opgona.training/index.php/2021/02/25/test-automation-its-not-a-buzz-word-but-a-grand-helping-hand/.
With iterative methodologies – such as Scrum and Agile – and cloud services, it has become normal practice for updates to be delivered at shorter intervals, leaving even less time to get full manual testing done, whether by customers or by dedicated resources in your team. Clearly, Dynamics 365 Business Central is a part of this story, and so is your app, as it will also be part of the AppSource paradigm, where companies around the world use Business Central in the cloud and can basically install any app from AppSource and start using it without any formal relation to, or instruction from, the manufacturer. If the app appears to be buggy, they will probably just as easily uninstall it and go for another. You cannot afford for your app to be put aside because your testing practice does not fit this reality. In the Software as a Service (SaaS) world, automated testing is a must to guarantee a quality product.
In the first edition, I started this section with:
Last but not least, on this incomplete discussion of arguments on the whys of test automation, and perhaps the sole reason for you reading this book: automated tests are required by Microsoft when you are going to sell your Dynamics 365 Business Central extension on AppSource.
This is no longer the case as Microsoft does not require automated testing as part of the AppSource validation anymore. They, however, strongly recommend … using automated tests, hence the question mark added to the section heading.
But even though Microsoft isn't forcing us anymore, know that our customers will request this from us more and more, seeing that your competitors do practice and provide automated testing – which gives those competitors the advantage of having more time to add features and release more often.
Note
Read Krzysztof Bialowas' blog post where he discusses this Microsoft policy change with a reference to the Microsoft communication on this:
http://www.mynavblog.com/2021/02/16/automated-tests-dont-listen-to-microsoft/.
Notwithstanding all the above, you might rightfully wonder whether test automation is a silver bullet that will solve everything. And I cannot deny that! However, I can tell you that, if exercised well, it will surely raise the quality of your development effort. As mentioned before, it has the following benefits:
Enough arguments to convince you why you would want to use automated tests, I guess. But how about when to use them? Ideally, this would be whenever code is changed to show that this functionality, which has already been tested, is still working as it should, to show that recent modifications have not compromised the existing application.
This sounds logical, but what does this mean when you have no automated tests in place? How do you go about starting to create your first ones? Basically, I would advise you to use the following criteria:
Using these two criteria, the following examples of code changes are typical candidates for your first efforts:
An after go-live bug reveals an omission in the initial test effort that can often be traced back to a flaw in the requirements. Frequently, it has a restricted scope, and, not the least important, a clear reproduction scenario. And by all means, such a bug should be prevented from ever showing its ugly face again.
You have this feature that keeps on troubling you and your customers. Bugs keep on popping up in this code and it never seems to stop. The most elementary thing you should start with is the after go-live bug fixing approach as previously discussed. But, even more importantly, use this code to get started on your first test suite.
Bugs are a particularly useful starting point because they usually provide the following:
These are three important elements of automated testing, as you will learn in the next few chapters.
One of the basic rules of good code governance is that code should only be changed when it is going to be tested. So, if the code is modified frequently, then the consequence is that it will also be tested frequently. Automating these tests will give a good return on investment for sure.
Thorough testing should always be the case, but, given the circumstances, it is unfortunately not always doable. Testing changes made to business-critical code, however, should always be exhaustive; that is, try to cover all scenarios of those processes and define a test for each one of them in your automated testing. You simply cannot afford any failures there. Make it a point of honor to find even the two to five percent of bugs that statistics tell us are always there!
Refactoring code can be nerve-racking: removing, rewriting, and reshuffling code. How do you know it is still doing the job it used to? Does it not break anything else? It certainly needs to be tested. But, when done manually, testing is often executed only after the whole refactoring is ready. That might already be too late, as too many pieces got broken. Before refactoring any code, grant yourself peace of mind and start by getting an automated test suite in place that proves the validity of the original code. To achieve this, define the business scenarios and create a test for each of them to prove that the current functionality works. With every refactoring step you take, run the suite again to show that the code is still behaving the same. This way, refactoring becomes fun. We will elaborate more on refactoring later.
Starting from scratch and creating a new feature, writing both test and application code, will be an undeniable experience. For some, this might be the ultimate way to go. For others, this is a bridge too far, in which case, all previous candidates are probably better ones. In Section 3, Designing and Building Automated Tests for Microsoft Dynamics 365 Business Central, we will take this approach and show you the value of writing test code alongside application code.
When incorporating any update from Microsoft, be it on-premises or in the cloud, your features must be (re)tested to prove they still function as before. If you do not have automated tests in place, begin creating them, based on an analysis of the various changes and the risks they might entail in terms of introducing errors.
Note
Working on test automation for a new feature will give you the best return on investment. It will lead to better code quality and thus prevent a lot of bugs after go-live/release. But it might be too big a threshold when you are just starting with test automation. In that case, choose one of the above-suggested candidates to start your test automation battle.
We discussed why you might want to automate your tests and when to do this, and more specifically, where to start. But we didn't give any thought to what automated testing is. So, let's do that before we conclude this chapter.
With automated testing, we address the automation of application tests, that is, scripting manual application tests that check the validity of features. In our case, these are the features that reside in Dynamics 365 Business Central. You might have noticed that we have been using somewhat different terms for it:
These all mean the same thing!!!
On the one hand, automated testing replaces manual, often exploratory, testing. It replaces those manual tests that are repeatable and automatable, and often no fun (anymore) to execute.
On the other hand, the two are complementary. Manual testing will still contribute to raising the quality of a feature, making use of creative and experienced human minds able to find holes in the current test design. Automated testing might also include so-called unit tests: tests that verify the working of the atomic units that together make up a feature. Typically, these units are single, global AL functions – units that would never be tested manually. Ultimately, both manual and automated tests serve the same goal: to verify that the object under test meets the requirements and doesn't do anything outside of them.
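To make this a bit more tangible, here is a minimal sketch of what such a unit test could look like in AL. The codeunit "My Discount Calc." and its ApplyDiscount function are invented purely for illustration; the Assert codeunit, however, is part of Microsoft's standard test libraries, and the Subtype and [Test] markers are how the platform recognizes test code:

```al
codeunit 50100 "My Discount Calc. Tests"
{
    // Marking a codeunit with Subtype = Test makes the test tool pick up
    // every procedure decorated with the [Test] attribute.
    Subtype = Test;

    var
        Assert: Codeunit Assert; // Microsoft's standard assertion library

    [Test]
    procedure TenPercentOffHundredGivesNinety()
    var
        MyDiscountCalc: Codeunit "My Discount Calc."; // hypothetical unit under test
    begin
        // [GIVEN] an amount of 100 and a discount of 10 percent
        // [WHEN] the discounted amount is calculated
        // [THEN] the result is 90
        Assert.AreEqual(90, MyDiscountCalc.ApplyDiscount(100, 10), 'Unexpected discounted amount');
    end;
}
```

Such a test exercises a single, global AL function directly – no pages, no user interaction – and rerunning it after every change is, as said before, just a matter of pushing the button. We will build real tests like this step by step in the coming chapters.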
Notes
(1) What is exploratory testing? Check out the following link for more information: https://en.wikipedia.org/wiki/Exploratory_testing.
(2) More on unit and functional tests can be found at the following link: https://www.softwaretestinghelp.com/the-difference-between-unit-integration-and-functional-testing/. We will also elaborate more on unit tests in this book.
Before we head off to the next chapter, I would like to add a couple of notes on automated tests that haven't been touched on so far.
There is no sense in denying: automated tests are also code. It takes time to design, implement, and maintain them. Like application code, any line of test code has a probability of containing a bug. The challenge is to keep this to a bare minimum. You can achieve this by:
And, please, like any source, put your tests under source code management. They're a part of your product. Make sure to reserve time for this in your planning.
Over time, your application changes and therefore the tests that go with it, be it manual or automated tests. Any change in the application, if well covered by test scenarios, will reflect in a number of failing tests as these no longer fit. Even a simple change in the order of the application code can lead to one or more tests falling over.
One of the reviewers of this book, Xavi Ametller Serrat, pointed out the difference between testing and checking. With testing, we investigate and learn, while checking is about confirming and verifying. The first is typically a human exercise, while the second is what we can automate. Given this insight, automated testing should rather be called automated checking. In this book, however, the former term will be used, as it links more closely to common usage.
Note
An interesting read on testing versus checking can be found here: https://www.developsense.com/blog/2009/08/testing-vs-checking/.
In this chapter, we discussed why you would want to automate your application tests, and when you might want to create and run them. And, we concluded with a short description of the what – what is automated testing?
In Chapter 2, Test Automation and Test-Driven Development, you will learn about the Test-Driven Development principles and how they can be applied to Dynamics 365 Business Central development that includes test automation.