
You're reading from Programming Microsoft Dynamics 365 Business Central - Sixth Edition

Product type: Book
Published in: Apr 2019
Publisher: Packt
ISBN-13: 9781789137798
Edition: 6th Edition
Authors (3):
Marije Brummel

Author, programmer, consultant, project manager, presenter, evangelist, sales person, and a trainer. It's next to impossible to find someone as experienced as Marije Brummel in the Business Central community. Marije received the Microsoft MVP and the NAVUG All-Star awards among several others. She has chaired the Dynamics Credentialing committee and has authored official Microsoft Exam materials. She's the go-to girl for performance troubleshooting and upgrade challenges. One of her biggest achievements was introducing Design Patterns into the Business Central community. Her books, blog articles, and YouTube videos have influenced almost everyone involved with Business Central. She enjoys the outdoors with her dog and loves spending time with her family.

David Studebaker

David Studebaker has been designing and developing software since 1962 as a developer, consultant, manager, and business owner. In 1967, David coauthored the first general-purpose SPOOL system, an AT&T / IBM joint project. He has been a founding partner in several firms, most recently Studebaker Technology and Liberty Grove Software. David's publications include a decade of technical reviews for ACM Computing Reviews and a number of articles on shop floor data collection. David originated the Packt series of books on programming Dynamics Business Central (aka Dynamics NAV). He has a BS in mechanical engineering from Purdue University and an MBA from the University of Chicago. He is a life member of the Association for Computing Machinery.

Christopher D. Studebaker

Chris Studebaker was a certified environmental consultant working with manufacturing facilities to meet national and state regulations before he started working with Navision in 1999. After working on regulatory reporting, data analysis, project management, and subcontractor oversight, Chris has used those skills to sell, develop, and implement NAV for the past 20 years. He has specialized in retail, manufacturing, job shop, and distribution implementations, mostly in high-user-count, high-data-volume applications. Chris acts in a consulting and training role for customers and for peer NAV professionals. He has a Bachelor of Science degree from Northern Illinois University and has done graduate work at Denmark Technical University.

Successful Conclusions
"The expected never happens; it is the unexpected always."
– John Maynard Keynes
"The most powerful designs are always the result of a continuous process of simplification and refinement."
– Kevin Mullet

Each update of Business Central includes new features and capabilities, and this release of Microsoft Dynamics 365 Business Central is no exception. It is our responsibility to understand how to apply these new features, use them intelligently in our designs, and develop them with both their strengths and limitations in mind. Our goal, in the end, is not only to provide workmanlike results but, if possible, to delight our users with the quality of the experience and the ease of use of our products.

In this chapter, we will cover the following topics:

  • Reviewing Business Central objects that contain functions we can use directly or as templates...

Creating new AL routines

Now that we have a good overall picture of how we enable users to access the tools we create, we are ready to start creating our own Business Central AL routines. It is important that you learn your way around the Business Central AL code in the standard product first. You may recall the advice in a previous chapter that the new code we create should be visually and logically compatible with what already exists. If we think of our new code as a guest being hosted by the original system, we will be doing what any thoughtful guest does: fitting smoothly into the host's environment.

An equally important aspect of becoming familiar with the existing code is that it increases the likelihood that we can take advantage of the features and components of the standard product to address some of our application requirements. There are at least two types of existing Business Central AL code that we should make use of whenever appropriate.

One group is the callable functions that...

Callable functions

Most of the callable functions in Business Central are designed to handle a very specific set of data or conditions and have no general-purpose use. For example, the routines to update Check Ledger entries during a posting process are likely to apply only to that specific function. If we are making modifications to a particular application area within Business Central, we may find functions that we can utilize, either as is or as models for our new functions.

There are also many functions within Business Central that are relatively general purpose. They either act on data that is common in many different situations, such as dates and addresses, or they perform processing tasks that are common to many situations, such as providing access to an external file. We will review a few such functions in detail, then list a number of others worth studying. If nothing else, these functions are useful as guides for showing you how such functions are used in Business Central. The...

Codeunit 358 – DateFilterCalc

This codeunit is a good example of how well-designed and well-written code has long-term utility. Except for code changes required by Business Central structural changes, this codeunit has changed very little since it originated in Business Central (Navision) V3.00 in 2001. That doesn't mean it is out of date; it means it was well thought out and complete from its beginning.

Codeunit 358 contains two functions we can use in our code to create filters based on the Accounting Period Calendar. The first is CreateFiscalYearFilter. If we are calling this from an object that has codeunit 358 defined as a global variable named DateFilterCalc, our call would use the following syntax:

DateFilterCalc.CreateFiscalYearFilter(Filter,Name,BaseDate,NextStep)

The calling parameters are Filter (text, length 30), Name (text, length 30), BaseDate (date), and NextStep (integer).
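As an illustrative sketch (the variable names here are our own, not taken from the standard code), a call that builds a filter for the fiscal year containing the work date might look like this:

```al
// Sketch: DateFilterCalc is a global variable of type Codeunit 358.
// FiscalYearFilter and FiscalYearName are our own Text[30] variables;
// a NextStep of 0 requests the fiscal year containing the base date
// (WORKDATE here).
DateFilterCalc.CreateFiscalYearFilter(FiscalYearFilter, FiscalYearName, WORKDATE, 0);
// FiscalYearFilter can then be applied to a date field, for example:
// GLEntry.SETFILTER("Posting Date", FiscalYearFilter);
```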

The second such function is CreateAccountingPeriodFilter, and has the following...

Codeunit 359 – Period Form Management

This codeunit contains three functions that can be used for date handling. These are FindDate, NextDate, and CreatePeriodFormat.

FindDate function

The description of the FindDate function is as follows:

  • Calling parameters (SearchString (text, length 3), Calendar (date table), PeriodType (option, integer))
  • Returns DateFound Boolean

The calling syntax for the FindDate function is as follows:

BooleanVariable := FindDate(SearchString,CalendarRec,PeriodType)

This function is often used in pages to assist with date calculation. The purpose of this function is to find a date in the virtual Date table based on the parameters that are passed. The search starts with an initial record in the Calendar table. If we pass in a record that has already been initialized by positioning the table at a date, that will be the base date; otherwise, the Work Date will be used.

The PeriodType is an options field with the option value choices of day, week, month, quarter, year, and accounting period. For ease of coding, we could call the function with the integer equivalent (0, 1, 2, 3, 4, 5), or set up our own equivalent Option variable...
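A hedged example call (PeriodFormMgt is our global variable for codeunit 359, and the search string shown is the one commonly seen in standard page code):

```al
// Sketch: position CalendarRec on the month (PeriodType 2) that
// contains the Work Date; DateFound reports whether a match was found.
DateFound := PeriodFormMgt.FindDate('=><', CalendarRec, 2);
```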

NextDate function

The description of the NextDate function is as follows:

  • Calling parameters (NextStep (integer), Calendar (the Date table), PeriodType (Option, integer))
  • Returns Integer

The calling syntax for the NextDate function is as follows:

IntegerVariable := NextDate(NextStep,CalendarRec,PeriodType)

NextDate will find the next date record in the Calendar table that satisfies the calling parameters. The Calendar and PeriodType calling parameters for NextDate have the same definition as they do for the FindDate function. However, for this function to be really useful, the Calendar must be initialized before calling NextDate. Otherwise, the function will calculate the appropriate next date from day 0. The NextStep parameter allows us to define the number of periods of PeriodType to move, so as to obtain the desired next date. For example, if we start with a Calendar table positioned on 01/25/16, a PeriodType of quarter (that is 3), and a NextStep of 2, the NextDate will move forward...
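Continuing the example in the text, the call might be sketched as follows (PeriodFormMgt being our global variable for codeunit 359):

```al
// Sketch: CalendarRec is already positioned on 01/25/16; move forward
// two quarters (PeriodType 3 = quarter, NextStep = 2).
StepsMoved := PeriodFormMgt.NextDate(2, CalendarRec, 3);
// CalendarRec is now positioned two quarter periods ahead
```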

CreatePeriodFormat function

The description of the CreatePeriodFormat function is as follows:

  • Calling parameters (PeriodType (Option, integer), Date (date))
  • Returns text, length 10

The calling syntax for the CreatePeriodFormat function is as follows:

FormattedDate := CreatePeriodFormat(PeriodType,DateData)

CreatePeriodFormat allows us to supply a date and specify which of its format options we want through the PeriodType. The function's return value is a ten-character formatted text value, for example, mm/dd/yy, ww/yyyy, mon/yyyy, qtr/yyyy, or yyyy.
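A hedged usage sketch (option value 2 assumed to be month, matching the PeriodType option order described earlier):

```al
// Sketch: format the Work Date as a month period (PeriodType 2).
FormattedDate := PeriodFormMgt.CreatePeriodFormat(2, WORKDATE);
```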

Codeunit 365 – Format Address

The functions in the Format Address codeunit do the obvious—they format addresses in a variety of situations. The address data in any master record (Customer, Vendor, Sales Order Sell-to, Sales Order Ship-to, Employee, and so on) may contain embedded blank lines. For example, the Address 2 line may be empty. When we print out the address information on a document or report, it will look better if there are no blank lines. These functions take care of such tasks.

In addition, Business Central provides setup options for multiple formats of City—Post Code—County—Country combinations. The Format Address functions format addresses according to what was chosen in the setup, or was defined in the Countries/Regions page for different Postal areas.

There are over 60 data-specific functions in the Format Address codeunit. Each data-specific function allows us to pass a record parameter for the record containing the raw address...
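For instance (a sketch; FormatAddress is our global variable for codeunit 365, and AddrArray is our own text array), formatting a customer address for a report might look like this:

```al
// The data-specific Customer function compresses out blank lines and
// applies the Country/Region-dependent City/Post Code/County format.
FormatAddress.Customer(AddrArray, CustomerRec);
// AddrArray now holds the print-ready address lines
```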

Codeunit 396 – NoSeriesManagement

Throughout Business Central, master records (for example Customer, Vendor, Item, and so on) and activity documents (Sales Order, Purchase Order, Warehouse Transfer Orders, and so on) are controlled by the unique identification number assigned to each one. This unique identification number is assigned through a call to a function within the NoSeriesManagement codeunit. That function is InitSeries. The calling format for InitSeries is as follows:

NoSeriesManagement.InitSeries(WhichNumberSeriesToUse, LastDataRecNumberSeriesCode, SeriesDateToApply, NumberToUse, NumberSeriesUsed)

The WhichNumberSeriesToUse parameter is generally defined on a Numbers Tab in the Setup record for the applicable application area. LastDataRecNumberSeriesCode tells the function what Number Series was used for the previous record in this table. The SeriesDateToApply parameter allows the function to assign ID numbers in a date-dependent fashion. NumberToUse and the NumberSeriesUsed...
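A typical usage pattern, sketched from the convention seen in standard tables' OnInsert triggers (the setup record and field names follow the Sales Order example and are illustrative):

```al
// Assign the next number from the configured series when the user
// has not entered one manually.
IF "No." = '' THEN BEGIN
  SalesSetup.GET;
  SalesSetup.TESTFIELD("Order Nos.");
  NoSeriesMgt.InitSeries(
    SalesSetup."Order Nos.", // which number series to use
    xRec."No. Series",       // series code used for the previous record
    "Posting Date",          // date for date-dependent numbering
    "No.",                   // receives the assigned number
    "No. Series");           // receives the series code actually used
END;
```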

Function models to review and use

It is very helpful when creating new code to have a model that works and that we can study (or clone). This is especially true in Business Central, where there is little or no development documentation available for many of the different functions we would like to use. One of the more challenging aspects of learning to develop in the Business Central environment is learning how to handle issues in the Business Central way. Learning the Business Central way is very beneficial because then our code works better, is easier to maintain, and easier to upgrade. There is no better place to learn the strengths and subtle features of the product than to study the code written by the developers who are part of the inner circle of Business Central creation.

If there is a choice, don't add custom functions to the standard Business Central codeunits. Well segregated customizations in clearly identified custom objects make both maintenance and upgrades easier...

Management codeunits

There are over 150 codeunits with the word Management (or the abbreviation Mgt) as part of their description name (filter the codeunit names using *Management*|*Mgt* – don't forget that such a filter is case-sensitive). Each of these codeunits contains functions whose purpose is the management of some specific aspect of Business Central data. Many are specific to a very narrow range of data. Some are more general because they contain functions we can reuse in another application area (for example, codeunit 396—NoSeriesManagement).

When we are working on an enhancement in a particular functional area, it is extremely important to check the Management codeunits that are utilized in that area. We may be able to use some existing standard functions directly. This will have the benefit of reducing the code we have to create and debug. Of course, when a new version is released, we will have to check to see if the functions on which we relied...

Multi-language system

The Business Central system is designed as a multi-language system, meaning it can interface with users in many languages. The base product is distributed with American English as the primary language, but each local version comes with one or more other languages ready for use. Because the system can be set up to operate from a single database, displaying user interfaces in several different languages, Business Central is particularly suitable for firms operating from a central system serving users in multiple countries. Business Central is used by businesses all over the world, operating in dozens of different languages. It is important to note that, when the application language is changed, it has no effect on the data in the database. The data is not multi-language unless we provide that functionality, by means of our own enhancements or data structure.

The basic elements that support the multi-language feature include the following:

  • Multi-language captioning...
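For example, a table field can carry captions for several languages at once (a sketch; the Danish text is an illustrative translation, not taken from the standard product):

```al
field(10; Description; Text[100])
{
    // ENU (American English) is the base caption; additional
    // languages are added as needed.
    CaptionML = ENU = 'Description', DAN = 'Beskrivelse';
}
```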

Multi-currency system

Business Central was one of the first ERP systems to fully implement a multi-currency system. Transactions can start in one currency and finish in another. For example, we can create an order in US dollars and accept payment for the invoice in Euros or Bitcoin or Yen. For this reason, where there are money values, they are generally stored in the local currency (referenced as LCY), as defined in the setup. There is a set of currency conversion tools built into the applications, and there are standard (by practice) code structures to support and utilize those tools. Two examples of code segments from the Sales Line table, illustrating the handling of money fields, are as follows:

In both cases, there's a function call to ROUND and use of the currency-specific Currency."Amount Rounding Precision" control value, as shown in the following code snippet screenshot:
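The essential pattern (a sketch of the convention, assuming the Currency record has already been retrieved for the document's Currency Code) is:

```al
// Round the money value using the precision defined for the
// document's currency rather than a hard-coded precision.
"Line Amount" := ROUND("Line Amount", Currency."Amount Rounding Precision");
```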

As we can see, before creating any modification that has money fields, we must familiarize...

Modifying for Navigate

If our modification creates a new table that will contain posted data and the records contain both Document No. and Posting Date fields, we can include this new table in the Navigate function.

The AL code for Navigate functionality based on Posting Date and Document No. is found in the FindRecords and FindExtRecords functions of page 344 Navigate. The following screenshot shows the segment of the Navigate CASE statement code for the CheckLedgerEntry table:

The code checks READPERMISSION. If that permission is enabled for this table, then the appropriate filtering is applied. Next, there is a call to the InsertIntoDocEntry function, which fills in the temporary table that is displayed in the Navigate page. If we wish to add a new table to the Navigate function, we must replicate this functionality for our new table.
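The pattern can be sketched as follows (modeled on the standard code; exact filters and parameters may differ by version):

```al
// One branch of the Navigate CASE statement: count matching
// Check Ledger Entry records and add a line to the results list.
IF CheckLedgEntry.READPERMISSION THEN BEGIN
  CheckLedgEntry.RESET;
  CheckLedgEntry.SETCURRENTKEY("Document No.", "Posting Date");
  CheckLedgEntry.SETFILTER("Document No.", DocNoFilter);
  CheckLedgEntry.SETFILTER("Posting Date", PostingDateFilter);
  InsertIntoDocEntry(
    Rec, DATABASE::"Check Ledger Entry", 0,
    CheckLedgEntry.TABLECAPTION, CheckLedgEntry.COUNT);
END;
```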

In addition, we must add the code that will call up the appropriate page to display the records that Navigate finds. This code...

Debugging in Business Central

In general, the processes and tools we use for debugging can serve multiple purposes. The most immediate purpose is always that of identifying the causes of errors and then resolving those errors. There are two categories of production errors, which may also occur during development, and Business Central's debugger module is very well-suited to addressing both of these. The Business Central debugger smoothly integrates developing in the Development Environment and testing in the Role Tailored Client (RTC).

The first category is the type that causes an error condition that terminates processing. In this case, the immediate goal is to find the cause and fix it as quickly as possible. The debugger is an excellent tool for this purpose. The second category is the type that, while running to completion successfully, gives erroneous results.

We often find that debugging techniques can be used to help us better understand how Business Central processes work...

Dialog function debugging techniques

Sometimes, the simpler methods are more productive than the more sophisticated tools, because we can set up and test quickly, resolve the issue (or answer a question), and move on. All the simpler methods involve using one of the AL DIALOG functions, such as MESSAGE, CONFIRM, DIALOG, or ERROR. All of these have the advantage of working well in the RTC environment. However, we should remember that none of these techniques conform to the Testing Best Practices in the Help topic: Testing the Application. These should only be used when a quick one-time approach is needed, or when recommended testing practices simply won't easily provide the information needed and one of these techniques will do so.

Debugging with MESSAGE and CONFIRM

The simplest debug method is to insert MESSAGE statements at key points in our logic. This is very simple and, if structured properly, provides us with a simple trace of the code logic path. We can number our messages to differentiate them and display any data (in small amounts) as part of a message, like so:

MESSAGE('This is Test 4 for %1',Customer."No.");

A big disadvantage is that MESSAGE statements do not display until processing either terminates or is interrupted for user interaction. Also, if you create a situation that generates hundreds of messages, you will find it quite painful to click through them individually at process termination.

If we force a user interaction at some point, then our accumulated messages will appear prior to the interaction. The simplest way to force user interaction is to issue a CONFIRM message in the format, as follows:

IF CONFIRM ('Test 1',TRUE) THEN;

If we want to do a simple trace...

Debugging with DIALOG

Another tool that is useful for progress tracking is the DIALOG function. DIALOG is usually set up to display a window with a small number of variable values. As processing progresses, the values are displayed in real time. The following are a few ways we can use this:

  • Simply tracking progress of processing through a volume of data. This is the same reason we would provide a DIALOG display for the benefit of the user. The act of displaying slows down processing somewhat, so we may want to update the DIALOG display occasionally, not on every record.
  • Displaying indicators when processing reaches certain stages. This can be used as a very basic trace with the indicators showing the path taken, so we may gauge the relative speed of progress through several steps.
  • We might have a six-step process to analyze. We could define six tracking variables and display all of them in the DIALOG. We would initialize each variable with values dependent on what we are tracking, such...
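A minimal sketch of the first two uses (the field numbers and update frequency are our own choices):

```al
// Open a progress window with two substitution fields, then update
// the counter only every 100 records to limit the display overhead.
Window.OPEN('Processing #1###### of #2######');
Window.UPDATE(2, Customer.COUNT);
Counter := 0;
IF Customer.FINDSET THEN
  REPEAT
    Counter += 1;
    IF Counter MOD 100 = 0 THEN
      Window.UPDATE(1, Counter);
    // ... per-record processing goes here ...
  UNTIL Customer.NEXT = 0;
Window.CLOSE;
```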

Debugging with text output

We can build a very handy debugging tool by outputting the values of critical variables or other informative indicators of progress, either to an external text file or to a table that's been created for this purpose. We need to either do this in single user mode or make it multi-user by including the USER ID on every entry.

This technique allows us to run a considerable volume of test data through the system, tracking some important elements while collecting data on the variable values, progress through various sections of code, and so on. We can even timestamp our output records so that we can use this method to look for processing speed problems.

Following the test run, we can analyze the results of our test more quickly than if we were using displayed information. We can focus on just the items that appear most informative and ignore the rest. This type of debugging is fairly easy to set up and to refine, as we identify the variables or code segments...
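One way to sketch this (the Debug Log table, with its User ID, Timestamp, and Comment fields, is a hypothetical custom table created for the purpose):

```al
// Append one trace entry per event of interest.
DebugLog.INIT;
DebugLog."User ID" := USERID;           // supports multi-user testing
DebugLog.Timestamp := CURRENTDATETIME;  // supports timing analysis
DebugLog.Comment := STRSUBSTNO('Posting %1, line %2',
  SalesLine."Document No.", SalesLine."Line No.");
DebugLog.INSERT;
```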

Debugging with ERROR

One of the challenges of testing is maintaining repeatability. Quite often, we need to test several times using the same data, but the test changes the data. If we have a small database, we can always back up the database and start with a fresh copy each time. However, that can be inefficient and, if the database is large, impractical. If we are using the built-in Business Central Test functions, we can roll back any database changes so the tests are totally repeatable. Another alternative is to conclude our test with an ERROR function to test and retest with exactly the same data.

The ERROR function forces a runtime error status, which means the database is not updated (it is rolled back to the status at the beginning of the process). This works well when our debugging information is provided by using the debugger or any of the DIALOG functions we just mentioned prior to the execution of the ERROR function. If we are using MESSAGE to generate debugging information...
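The technique can be as simple as this sketch (TestPostingRoutine is a hypothetical process under test):

```al
// Run the process, then force a runtime error so that every database
// change in the transaction is rolled back, leaving the test data
// unchanged for the next run.
TestPostingRoutine(SalesHeader);
ERROR('Test run complete - all changes rolled back');
```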

The Business Central debugger

Debugging is the process of finding and correcting errors. Business Central uses the Visual Studio Code debugger, which can only be activated from within Visual Studio Code.

Activating the debugger

Activating the debugger from Visual Studio Code is a simple matter of clicking on Debug | Start Debugging (or Shift + Ctrl + F5). The initial page that displays when the debugger is activated will look as follows (typically, with each session having a different User ID).

When we click on Start Debugging, an empty debugger page will open, awaiting the event that will cause a break in processing and the subsequent display of code detail (Variables), watched variables (Watches), and the Call Stack:

Creating break events

Once the debugger is activated and attached to a session, a break event must occur to cause the debug trace and associated data display to begin. Break events include (but are not limited to) the following:

  • An error occurs that would cause object execution to be terminated
  • A previously set breakpoint is reached during processing
  • A record is written when the Break on Record Write break rule is active
  • The Pause icon is clicked in the popup of the debugger page

Of the preceding events, the two most common methods of starting up a debug trace are the first two, an error, or reaching a previously set breakpoint. If, for example, an error condition is discovered in an operating object, the debugging process can be initiated by doing the following:

  • Activating the debugger
  • Running the process where the error occurs

When the error occurs, the page parts (Variables, Watches, and Call Stack) in the debug window will be populated, and we can proceed to investigate variable...

The debugger window

When viewing AL code in a Designer, breakpoints can be set, disabled, or removed by pressing the F9 key. When viewing AL code in the Code window of the debugger, breakpoints can only be set or removed by pressing the F9 key or clicking on the Toggle icon. Other debugger breakpoint controls are shown in the following screenshot:

The following are the ribbon actions that you can use:

  • Pause: This breaks at the next statement
  • Step Over: This is designed to execute a function without stopping, then break
  • Step Into: This is designed to trace into a function
  • Step Out: This is designed to complete the current function without stopping, and then break
  • Restart: This stops the current session and republishes the extension
  • Stop: This stops the current activity but leaves the debugging session active

VARIABLES displays the Debugger Variable List where we can examine the status of all variables that are in scope. Additional variables can be added to the Watch list here, as shown...

Visual Studio Code test-driven development

Business Central includes the enhanced AL Testability feature set, which is designed to support AL test-driven development. Test-driven development is an approach where the application tests are defined prior to the development of the application code. Ideally, the code supporting application tests is written before, or at least at the same time as, the code implementing the target application function.

Advantages of test-driven development include the following:

  • Designing testing processes in conjunction with functional design
  • Finding bugs early
  • Preventing bugs from reaching production
  • Enabling regression testing, which prevents changes from introducing new bugs into previously validated routines

The AL Testability feature provides test-specific types of codeunits: test codeunits and test running codeunits. Test codeunits contain test methods, UI handlers, and AL code to support Test methods, including...
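A minimal test codeunit might be sketched like this (the object ID, names, and scenario are illustrative):

```al
codeunit 50100 "My Feature Tests"
{
    // Subtype = Test marks this as a test codeunit; a test runner
    // executes each [Test] method and records pass/fail (a method
    // fails if it ends with an unhandled error).
    Subtype = Test;

    [Test]
    procedure BlankCustomerNameIsRejected()
    var
        Customer: Record Customer;
    begin
        Customer.INIT;
        // asserterror expects the next statement to raise an error;
        // the test fails if no error occurs.
        asserterror Customer.TESTFIELD(Name);
    end;
}
```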

Other interfaces

Some Business Central systems must communicate with other software or even with hardware. Sometimes, that communication is Inside-Out (that is, initiated by Business Central) and sometimes, it is Outside-In (that is, initiated by the outside connection). It's not unusual for system-to-system communications to be a two-way street, a meeting of peers. To supply, receive, or exchange information with other systems (hardware or software), we will need at least a basic understanding of the interface tools that are part of Business Central.

Because of Business Central's unique data structures and the critical business logic embedded therein, it is very risky for an external system to access Business Central data directly via SQL Server without using AL-based routines as an intermediary.

Business Central has a number of methods of interfacing with the world outside its database. We will review those very briefly here. To learn more about these, we should begin by reviewing...

Client Add-ins

The Business Central Client Add-in API (also known as Client Extensibility) provides the capability to extend the client through the integration of external non-Business Central controls. The Client Add-in API uses .NET interfaces as the binding mechanism between a control add-in and the Business Central framework. Different interfaces and base classes are available to use, or a custom interface can be created. Controls can be designed to raise events that call on the OnControlAddin trigger on the page field control that hosts the add-in. They can also add events and methods that can be called from within AL.

Contrary to the limitations on other integration options, Client add-ins can be graphical and appear on the client display as part of, or mingled with, native Business Central controls. The following are a few simple examples of how Client add-ins might be used to extend client UI behavior:

  • A Business Central text control that looks normal but offers a special behavior...

Client Add-in comments

To take advantage of the Client Add-in capabilities, we will need to develop at least minimal JavaScript programming skills. Care must be taken when designing add-ins so that their user interface style complements, rather than clashes with, the UI standards of the Business Central client.

Client Add-ins are a major extension of the Dynamics Business Central system. This feature allows ISVs to create and sell libraries of new controls and new control-based micro-applications. It allows vertically focused partners to create versions of Business Central that are much more tailored to their specific industries. This feature allows for the integration of third-party products, software, and hardware, at an entirely new level.

The Client add-in feature is very powerful. If you learn how to use it, you will have another flexible tool in your kit, and your users will benefit from it.

Business Central development projects – general guidance

Now that we understand the basic workings of the Business Central Visual Studio Code development environment and AL, we'll review the process of software design for Business Central enhancements and modifications.

When we start a new project, the goals and constraints for the project must be defined. The degree to which we meet these will determine our success. Some examples are as follows:

  • What are the functional requirements and what flexibility exists within these?
  • What are the user interface standards?
  • What are the coding standards?
  • What are the calendar and financial budgets?
  • What existing capabilities within Business Central will be used?

Knowledge is key

Designing for Business Central requires more forethought and knowledge of the operating details of the application than was needed with traditional models of ERP systems. As we have seen, Business Central has unique data structure tools (SIFT and FlowFields), quite a number of Business Central-specific functions that make it easier to program business applications, and a software data structure (journal, ledger, and so on) that is inherently an accounting data structure. The learning curve to become an expert in the way Business Central works is a steep one. Business Central has a unique structure, and the primary documentation from Microsoft is limited to the embedded Help, which improves with every release of the product. The Business Central books published by Packt can be of great help, as are the Business Central Development Team blogs, the blogs from various Business Central experts around the world, and the Business Central forums that were mentioned earlier...

Data-focused design

Any new application design must begin with certain basic analysis and design tasks. This is just as applicable whether our design is for new functionality to be integrated into Business Central or is for an enhancement/expansion of existing Business Central capabilities.

First, determine what underlying data is required. What will it take to construct the information the users need to see? What level of detail and in what structural format must the data be stored so that it may be quickly and completely retrieved? Once we have defined the inputs that are required, we must identify the sources of this material. Some may be input manually, some may be forwarded from other systems, some may be derived from historical accumulations of data, and some will be derived from combinations of all these, and more. In any case, every component of the information needed must have a clearly defined point of origin, schedule of arrival, and format.

Defining the required data views

Define how the data should be presented by addressing the following questions:

  • How does it need to be "sliced and diced"?
  • What levels of detail and summary are required?
  • What sequences and segmentations are required?
  • What visual formats will be used?
  • What media will be used?
  • Will the users be local or remote?

Ultimately, many other issues also need to be considered in the full design, including user interface specifications, data and access security, accounting standards and controls, and so on. Because a wide variety of tools is available to extract and manipulate Business Central data, we can start relatively simply and expand as appropriate later. The most important thing is to ensure that we have all the critical data elements identified and then captured.

Designing the data tables

Data table definition includes the data fields, the keys to control the sequence of data access and to ensure rapid processing, frequently used totals (which are likely to be set up as SumIndex fields), references to lookup tables for allowed values, and relationships to other primary data tables. We need to do a good job of designing not only the primary tables, but also all the supporting tables containing lookup and setup data. When integrating a customization, we must consider the effects of the new components on the existing processing, as well as how the existing processing ties into our new work. These connections are often the finishing touch that makes the new functionality operate in a truly seamless, integrated fashion with the original system.
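
As a sketch of how these elements come together, the following hypothetical table (all object names and numbers here are illustrative, not part of the base application) shows fields, a clustered primary key, a secondary key with SumIndexFields for fast SIFT totaling, and TableRelation lookups:

```al
table 50100 "Seminar Registration Line"
{
    fields
    {
        field(1; "Seminar No."; Code[20])
        {
            // Reference to a hypothetical primary data table
            TableRelation = "Seminar Header";
        }
        field(2; "Line No."; Integer) { }
        field(3; "Participant No."; Code[20])
        {
            // Lookup into an existing base table
            TableRelation = Contact;
        }
        field(4; Amount; Decimal) { }
    }
    keys
    {
        key(PK; "Seminar No.", "Line No.") { Clustered = true; }
        // SIFT index: a frequently used total maintained by the platform
        key(SeminarAmount; "Seminar No.") { SumIndexFields = Amount; }
    }
}
```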

Designing the user data access interface

The following are some of the principal design issues that need to be considered:

  • Design the pages and reports to be used to display or interrogate the data.
  • Define what keys are to be used or available to the users (though the SQL Server database supports sorting data without predefined Business Central AL keys).
  • Define what fields will be allowed to be visible, what the totaling fields are, how the totaling will be accomplished (for example, FlowFields or on-the-fly processing), and what dynamic display options will be available.
  • Define what type of filtering will be needed. Some filtering needs may be beyond the ability of the built-in filtering function and may require auxiliary code functions.
  • Determine whether external data analysis tools will be needed and will therefore need to be interfaced.
  • Design considerations at this stage often result in returning to the previous data structure definition stage to add additional data fields, keys...
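
For the totaling options mentioned above, a FlowField is often the natural choice. As a sketch (assuming the hypothetical registration line table from earlier in this chapter's examples carries a SIFT key on its Amount field), a header table could expose the total like this:

```al
table 50101 "Seminar Header"
{
    fields
    {
        field(1; "No."; Code[20]) { }
        field(10; "Total Amount"; Decimal)
        {
            // FlowField: computed on demand from the SIFT index on a
            // hypothetical line table; nothing is stored in this record
            FieldClass = FlowField;
            CalcFormula = sum("Seminar Registration Line".Amount
                              where("Seminar No." = field("No.")));
            Editable = false;
        }
    }
    keys
    {
        key(PK; "No.") { Clustered = true; }
    }
}
```

Pages calculate FlowFields automatically; code that reads the field directly must call CalcFields first.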

Designing the data validation

Define exactly how the data must be validated before it is accepted upon entry into a table. There are likely to be multiple levels of validation. There will be a minimum level, which defines the minimum set of information required before a new record is accepted.

Subsequent levels of validation may exist for particular subsets of data, which, in turn, are tied to specific optional uses of the table. For example, in the base Business Central system, if the manufacturing functionality is not being used, the manufacturing-related fields in the Item Master table do not need to be filled in. However, if they are filled in, they must satisfy certain validation criteria.

As we mentioned earlier, the sum total of all the validations that are applied to data when it is entered into a table may not be sufficient to completely validate the data. Depending on the use of the data, there may be additional validations being performed during processing, reporting, or inquiries...
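
To make the levels of validation concrete, one common pattern is an OnValidate trigger that accepts a blank value (the minimum level) but enforces extra criteria once the optional field is filled in. A minimal sketch, with a hypothetical custom field added to the standard Item table:

```al
tableextension 50102 "Item Inspection Ext" extends Item
{
    fields
    {
        field(50100; "Inspection Code"; Code[10])
        {
            trigger OnValidate()
            begin
                // Minimum level: a blank value is always acceptable
                if "Inspection Code" = '' then
                    exit;
                // Subset level: once this optional field is used, the
                // related master data must also be complete
                TestField("Base Unit of Measure");
            end;
        }
    }
}
```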

Data design review and revision

Perform these three steps (table design, user access, data validation) for the permanent data (Masters and Ledgers) and then for the transactions (Journals). Once all of the supporting tables and references have been defined for the permanent data tables, there are not likely to be many new definitions required for the Journal tables. If any significant new supporting tables or new table relationships are identified during the design of Journal tables, we should go back and reexamine the earlier definitions. Why? Because there is a high likelihood that this new requirement should have been defined for the permanent data and was overlooked.

Designing the Posting processes

First, define the final data validations, and then define and design all the ledger and auxiliary tables (for example, Registers, Posted Document tables, and so on). At this point, we are determining what the permanent content of the posted data will be. If we identify any new supporting table or table reference requirements at this point, we should go back to the first step to make sure that this requirement didn't need to be in the design definition.

Whatever variations in data are permitted to be posted must be acceptable in the final, permanent instance of the data. We must ensure that any information or relationships required in the final posted data are present before posting is allowed to proceed.

Part of the posting design is to determine whether data records will be accepted or rejected individually or in complete batches. If the latter, we must define what constitutes a batch; if the former, it is likely that the makeup of a...

Designing the supporting processes

Design the processes that are necessary to validate, process, extract, and format data for the desired output. In earlier steps, these processes can be defined as "black boxes" with specified inputs and required outputs, but without undue regard for the details of the internal processes. This allows us to work on the several preceding definitions and design steps without being sidetracked into the inner workings of the output-related processes.

These processes are the cogs and gears of the functional application. They are necessary, but often not pretty. By leaving the design of these processes in the application design as late as possible, we increase the likelihood that we will be able to create common routines and standardize how similar tasks are handled across a variety of parent processes. At this point, we may identify opportunities or requirements for improvement in material that was defined in a previous design step. In that case...

Double-check everything

Lastly, review all the defined reference, setup, and other control tables to make sure that the primary tables and all defined processes have all the information available when needed. This is a final design quality control step.

It is important to realize that returning to a previous step to address a previously unidentified issue is not a failure of the process; it is a success. An appropriate quote that's used in one form or another by construction people, the world over, is Measure twice, cut once. It is much cheaper and more efficient (and less painful) to find and fix design issues during the design phase rather than after the system is in testing or, worse yet, in production.

Designing for efficiency

Whenever we are designing a new modification, we will not only need to design to address the defined needs, but also to provide a solution that processes efficiently. An inefficient solution carries unnecessary ongoing costs. Many of the things that we can do to design an efficient solution are relatively simple:

  • Properly configure system and workstation software (often overlooked)
  • Make sure networks can handle the expected load (with capacity to spare)
  • Have enough server memory to avoid using virtual memory, since virtual memory = disk
  • Most of all, do everything reasonable to minimize disk I/O

Disk I/O

The slowest component of any computer system is disk I/O; it almost always takes more time than any other system processing activity. When we begin concentrating our design efforts on efficiency, our first focus should be on minimizing disk I/O.

The most critical elements are the design of the keys, the number of keys, the design of the SIFT fields, the number of SIFT fields, the design of the filters, and the frequency of access of data (especially FlowFields). If our system will have five or ten users processing a few thousand order lines per day, and is not heavily modified, we probably won't have much trouble. However, if we are installing a system with one or more of the following attributes (any of which can have a significant effect on the amount of disk I/O), we will need to be very careful with our design and implementation:

  • Large concurrent user count
  • High transaction volumes, especially in data being Posted
  • Large stored data volumes, especially resulting...
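
As a small illustration of I/O-conscious code, the following sketch totals inventory for an item using CalcSums over the standard Item Ledger Entry table; with a matching SIFT key the platform can answer from the index, and otherwise the sum is still computed server-side rather than by looping through rows in AL:

```al
codeunit 50103 "Inventory Inquiry Sketch"
{
    procedure QuantityOnHand(ItemNo: Code[20]): Decimal
    var
        ItemLedgerEntry: Record "Item Ledger Entry";
    begin
        // Filter on an indexed field so SQL Server can seek, not scan
        ItemLedgerEntry.SetRange("Item No.", ItemNo);
        // One server-side aggregation instead of a FindSet row loop
        ItemLedgerEntry.CalcSums(Quantity);
        exit(ItemLedgerEntry.Quantity);
    end;
}
```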

Locking

One important aspect of the design of an integrated system such as Business Central, often overlooked until it rears its ugly head after the system goes into production, is locking. Locking occurs when one process takes control of a data element, record, or group of records (in other words, part or all of a table) in order to update the data within the locked range and, at the same time, another process requests some portion of that data but finds it locked by the first process.

A deadlock is a serious design flaw in which each process holds a lock on data that the other process needs, so neither process can proceed. One of our responsibilities as developers or implementers is to minimize locking problems and eliminate any deadlocks.

Locking interference between processes in an asynchronous processing environment is inevitable. There will always be points in the system where one process instance locks out another...
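
One simple discipline that prevents deadlocks is for every process to acquire its locks on shared tables in the same agreed order. A sketch (the routine and its contents are hypothetical):

```al
codeunit 50104 "Posting Lock Order Sketch"
{
    procedure PostJournalBatch()
    var
        GenJournalLine: Record "Gen. Journal Line";
        GLEntry: Record "G/L Entry";
    begin
        // Every posting routine should lock these tables in the SAME
        // order (journal first, then ledger); two processes locking in
        // opposite orders is exactly what produces a deadlock
        GenJournalLine.LockTable();
        GLEntry.LockTable();
        // ... validate and write entries; COMMIT releases the locks
    end;
}
```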

Updating and upgrading

You must be able to differentiate between updating a system and upgrading a system. In general, most of the Business Central development work we will do is modifying individual Business Central systems to provide tailored functions for end user firms. Some of those modifications will be created by developers as part of an initial system configuration and implementation before the Business Central system is in production use. Other such modifications will be targeted at a system that is in day-to-day production to bring the system up to date with changes in business process or external requirements. We'll refer to these system changes as updating.

Upgrading is when we implement a new version of the base AL application code distributed by Microsoft and port all the previously existing modifications into that new version. First, we'll discuss updating, and then we'll discuss upgrading.

Design for updating

Any time we are updating a production system by applying modifications to it, a considerable amount of care is required. Many of the disciplines that should be followed in such an instance are the same for a Business Central system as for any other production application system. However, some disciplines are specific to Business Central and the Visual Studio Code environment.

Microsoft's practice of providing Business Central updates on a frequent basis, so that systems can be kept current with fixes and minor feature enhancements, increases the importance of designing for ease of updating. Keeping up with these Microsoft-provided updates is especially important for multi-tenant systems running in the cloud, that is, systems serving multiple unrelated customers with the software and databases resident on internet-based server systems.

Customization project recommendations

Even though there are new tools to help us update our Business Central systems, we should still follow good practices in our modification designs and in the processes of applying updates. Some of these recommendations may seem obvious; how obvious they seem is a measure of our personal store of experience and common sense. Even so, it is surprising how many projects go sour because one (or many) of the following are not considered in the process of developing modifications:

  • One modification at a time
  • Design thoroughly before coding
  • Design the testing in parallel with the modification
  • Use the AL Testability feature extensively
  • Multi-stage testing:
    • Cronus for individual objects
    • Special test database for functional tests
    • Copy of production database for final testing as appropriate
    • Setups and implementation
  • Testing full features:
    • User interface tests
    • System load tests
    • User Training
  • Document and deliver in a predefined, organized manner
  • Follow...

One change at a time

It is important to make changes to objects in a very well-organized and tightly-controlled manner. In most situations, only one developer at a time will make changes to an object. If an object needs to be changed for multiple purposes, the first set of changes should be fully tested (at least through the development testing stage) before the object is released to be modified for a second purpose.

If the project is so large and complex, or deadlines are so tight, that this one-modification-at-a-time approach is not feasible, we should consider using a source control system, such as Git or Team Foundation Server, which can help keep multiple concurrent modifications separate.

Similarly, we should only be working on one functional change at a time. As developers, we might be working on changes in two different systems in parallel, but we shouldn't be working on multiple changes in a single system simultaneously. It's challenging enough to keep all...

Testing

As we all know, there is no substitute for complete and thorough testing. Fortunately, Business Central provides some very useful tools, such as the ones we previously discussed, to help us to be more efficient than we might be in some other environment. In addition to the built-in testing tools, there are also some testing techniques that are Business Central-specific.
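
The AL Testability feature referred to here lets us capture such tests as code. A minimal sketch of a test codeunit (object name and test data are illustrative):

```al
codeunit 50105 "Modification Test Sketch"
{
    Subtype = Test;

    [Test]
    procedure CustomerCanBeCreatedWithMinimumData()
    var
        Customer: Record Customer;
    begin
        // [GIVEN] a new customer with only the minimum required data
        Customer.Init();
        Customer."No." := 'TEST-0001';
        // [WHEN] the record is inserted with validation
        Customer.Insert(true);
        // [THEN] it can be retrieved again
        Customer.Get('TEST-0001');
    end;
}
```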

Database testing approaches

If the new modifications are not tied to previous modifications and specific customer data, then we may be able to use the Cronus database as a test platform. This works well when our target is a database that is not heavily modified in the area on which we are currently working. As the Cronus database is small, we will not get lost in large data volumes. Most of the master tables in Cronus are populated, so we don't have to create and populate this information. Setups are done and generally contain reasonably generic information.

If we are operating with an unmodified version of Cronus, we have the advantage that our test is not affected by other preexisting modifications. The disadvantage, of course, is that we are not testing in a wholly realistic situation. Because the data volume in Cronus is so small, we are not likely to detect a potential performance problem.

Even when our modification is targeted at a highly modified system, where those other...

Testing in production

While it is always a good idea to thoroughly test before adding our changes to the production system, sometimes, we can safely do our testing inside the production environment. If the modifications consist of functions that do not change any data and can be tested without affecting any ongoing production activity, it may be feasible to test within the production system.

Examples of modifications that may be testable in the live production system range from a simple inquiry page, a new analysis report, or an export of data to be processed outside the system, to a completely new subsystem that does not change any existing data. There are also situations where the only changes to the existing system are the addition of fields to existing tables. In such a case, we may be able to test part of the modification outside production, then implement the table changes and complete the rest of the testing in the context of the production system...

Using a testing database

From a testing point of view, the most realistic testing environment is a current copy of an actual production database. There are sometimes apparently good excuses as to why it is just too difficult to test using a copy of the actual production database.

Don't give in to excuses—use a testing copy of the production database!

Remember, when we implement our modifications, they will receive the test by fire in the environment of production. We need to do everything within reason to assure success. Let's review some of the potential problems involved in testing with a copy of the production database and how to cope with them:

  • It's too big: This is not a good argument relative to disk space. Disk space is so inexpensive that we can almost always afford plenty of disk space for testing. We should also make every possible useful intermediate stage backup. Staying organized and making lots of backups may be time-consuming, but when done well and...

Testing techniques

As experienced developers, we are already familiar with good testing practices. Even so, it never hurts to be reminded about some of the more critical habits to maintain.

Any modification that is more than trivial should be tested, one way or another, by at least two people. The people who are assigned should not be part of the team that created the design or coded the modification. It is best if one of the testers is an experienced user, because users have a knack (for obvious reasons) for comparing how the modification operates with how the rest of the system behaves in the course of day-to-day work. This helps us obtain meaningful feedback on the user interface before going into production.

One of the testing goals is to supply unexpected data and make sure that the modification can deal with it properly. Unfortunately, those who were involved in creating the design will have a very difficult time being creative in supplying the unexpected. Users...

Deliverables

Create useful documentation and keep good records of testing processes and results. Testing scripts, both human-oriented and AL Testability Tool-based, should be retained for future reference. Document the purpose of the modifications from a business point of view. Add a brief, but complete, technical explanation of what must be done from a functional design and coding point of view to accomplish the business purpose. Briefly record the testing that was done. The scope of the record keeping should be directly proportional to the business value of the modifications being made and the potential cost of not having good records. These investments are a form of insurance and preventive medicine. We hope they won't be needed, but we have to allow for the possibility that they might be.

More complex modifications should be delivered and installed by experienced implementers—perhaps even by the developers themselves. Small Business Central modifications may be transmitted...

Finishing the project

Bring projects to a conclusion; don't let them drag on through inaction and inattention, where open issues get forgotten and never addressed. Get it done, wrap it up, and then review what went well and what didn't, both for remediation and for application to future projects.

Set up ongoing support services as appropriate and move on to the next project. With the flexibility of the Role Tailored Client allowing page layout changes by both super users (configuration) and users (personalization), the challenge of user support has increased. No longer can the support person expect to know what display the user is viewing today.

Consequently, support services will almost certainly require that the support person be able to view the user's display. Without that, two-way communication between the support personnel and the user will be much more difficult, time-consuming, and frustrating. If it doesn't...

Plan for upgrading

The ability to upgrade a customized system is a very important feature of Business Central. Most other complex business application systems are very difficult to customize at the database-structure and process-flow levels. Business Central readily offers this capability. This is a significant difference between Business Central and the competitive products in the market.

Complementing the ability to be customized is the ability to upgrade a customized Business Central system. While not a trivial task, at least it is possible with Business Central. In many other systems, the only reasonable path to an upgrade is often to discard the old version and reimplement with the new version, recreating all customizations. Not only is Business Central unusually accommodating to being upgraded, but with each new version of the system, Microsoft has enhanced the power and flexibility of the tools it provides to help us do upgrades.

We may say, why should a developer care about upgrades...

Benefits of upgrading

To ensure that we are on common ground about why upgrading is important to both the client and the Business Central partner, the following is a brief list of some of the benefits available from an upgrade:

  • Easier support of a more current version
  • Access to new features and capabilities
  • Continued access to fixes and regulatory updates
  • Improvements in speed, security, reliability, and user interface
  • Assured continuation of support availability
  • Compatibility with necessary infrastructure changes, such as new operating system versions
  • Opportunity to do needed training, data cleaning, and process improvement
  • Opportunity to resolve old problems, do postponed housekeeping, and create a known system reference point

This list is not complete, and not every benefit will be realized in any one situation.

Coding considerations

The most challenging and important part of an upgrade is porting code and data modifications from the older version of a system to the new version. When the new version has major design or data structure changes in an area that we have customized, it is quite possible that our modification structure will have to be redesigned and, perhaps, even be recoded from scratch.

On the other hand, many times, the changes in the new product version of Business Central don't affect much existing code, at least in terms of the base logic. If our modifications are done properly, it's often not very difficult to port custom code from the older version into the new version. By applying what some refer to as low-impact coding techniques, we can make the upgrade job easier and, thereby, less costly.

Low-impact coding

We have already discussed most of these practices in other chapters, but will review them here, in the context of coding to make it easier to upgrade. We won't be able to follow each of these, but will have to choose the degree to which we can implement low-impact code and which of these approaches fit our situation (this list may not be all-inclusive):

  • Separate and isolate new code
  • Create functions for significant amounts of new code that can be accessed using single-line function calls
  • Either add independent codeunits as repositories of modification functions or, if that is overkill, place the modification functions within the modified objects
  • Add new data fields; don't change the usage of existing fields
  • When the functionality is new, add new tables rather than modifying existing tables
  • For minor changes, modify the existing pages; otherwise, clone the pages and modify the copies
  • Copy, then modify the copies of reports and XMLports, rather than modifying the...
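
In current AL, most of these guidelines map directly onto extension objects and event subscribers: new fields live in a table extension, and new logic lives in its own codeunit, hooked in without touching base code. A sketch (the field name and bonus value are hypothetical):

```al
// New data: added in a table extension, never by changing base fields
tableextension 50106 "Customer Loyalty Ext" extends Customer
{
    fields
    {
        field(50100; "Loyalty Points"; Integer) { }
    }
}

// New logic: isolated in its own codeunit and attached with an
// event subscriber instead of edits inside base application objects
codeunit 50107 "Loyalty Mgt. Sketch"
{
    [EventSubscriber(ObjectType::Table, Database::Customer, 'OnBeforeInsertEvent', '', false, false)]
    local procedure InitLoyalty(var Rec: Record Customer)
    begin
        Rec."Loyalty Points" := 100; // hypothetical welcome bonus
    end;
}
```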

Supporting material

With every Business Central system distribution, there have been some reference guides. These are minimal in Business Central. There are previously published guides available but, sometimes, we have to search for them. Some were distributed with previous versions of the product, but not with the latest version. Some are posted at various locations on PartnerSource, or another Microsoft website. Some may be available on one of the forums or on a blog.

Be a regular visitor to websites for more information and advice on AL, Business Central, and other related topics. The https://dynamicsuser.net/ and https://mibuso.com/ websites are especially comprehensive and well attended. Other smaller or more specialized sites also exist. Some of those that are available at the time of writing this book are as follows:

Summary

We have covered many topics in this book, with the goal of helping you become productive in AL development with Dynamics Business Central. Hopefully, you've found your time spent with us to be a good investment. From this point on, your assignments are to continue exploring and learning, enjoy working with Business Central, Visual Studio Code, and AL, and to treat others as you would have them treat you.

"We live in a world in which we need to share responsibility. It's easy to say "It's not my child, not my community, not my world, not my problem." There are those who see the need and respond. Those people are my heroes."
– Fred Rogers
"Be kind whenever possible. It is always possible."
– Dalai Lama

Questions

  1. Which one of the following provides access to several libraries of functions for various purposes that are widely used throughout the NAV system? Choose one:
    • Codeunit 412—Common Dialog Management
    • Codeunit 408—Dimension Management
    • Codeunit 396—NoSeriesManagement
    • Codeunit 1—Application Management
  2. The Help files for Business Central cannot be customized by partner or ISV developers. True or false?
  3. Which of the following are good coding practices? Choose three:
    • Careful naming
    • Good documentation
    • Liberal use of wildcards
    • Design for ease of upgrading
  4. Custom AL code is not allowed to call functions that exist in the base Microsoft code that was created for Business Central objects. True or false?
  5. Business Central's multi-language capability allows an installation to have multiple languages active at one time. True or false?
  6. Designing to minimize disk I/O in Business Central is not important because SQL Server takes care of everything. True or false...
You have been reading a chapter from
Programming Microsoft Dynamics 365 Business Central - Sixth Edition
Published in: Apr 2019 | Publisher: Packt | ISBN-13: 9781789137798
