Microsoft System Center Data Protection Manager 2012 SP1

By Steve Buchanan (MVP) , Islam Gomaa , Robert Hedblom and 1 more

About this book

Microsoft System Center Data Protection Manager is a centralized data protection solution. DPM is used for data protection and recovery for Microsoft workloads.

Data Protection Manager allows backup and recovery of Microsoft workloads, including SQL Server, Exchange, SharePoint, client computers, and Hyper-V. In addition to disk- and tape-based backup methods, DPM also allows central management of the system state and bare-metal recovery.

Microsoft System Center Data Protection Manager 2012 SP1 is a guide for administrators of System Center Data Protection Manager. By the end of this book, users will be able to carry out automated installs, migrate DPM to new hardware, set up custom reporting, use the DPM central console, and implement offsite DPM strategies such as chaining, monitoring, and cyclic protection.

In this book you will gain insight from Microsoft Most Valued Professionals into the new features in DPM 2012 along with an understanding of the core tasks that administrators will face, including installing and configuring DPM 2012, workload protection, and managing the system. It will also show administrators how to effectively create backups of the protected workloads and use these backups to recover from a disaster.

It will also contain information on backup networks, client protection, and how to automate tasks in DPM to make your job as an administrator easier.

After reading this book, you should be confident enough to master protecting your organization's data with Microsoft System Center Data Protection Manager.

Publication date:
June 2013


Chapter 1. What is Data Protection Manager?

This chapter will give you a good understanding of what System Center Data Protection Manager (DPM) is and how it works, using the underlying components in the operating system.

There are many different backup software vendors competing in the market today. They all have one thing in common: they perform backups. The big difference between third-party backup software and DPM is that DPM isn't backup software, it's a restore product. This was the primary idea from Microsoft when introducing DPM to the market. You shouldn't need to be a DBA to restore your SQL databases, nor should you need to be a SharePoint administrator to be able to perform fast, optimized, and fully supported restore operations in your Microsoft environment. As a DPM administrator, you will have the ability to perform all restore operations possible in your datacenter or smaller server environments.

DPM uses many different components and functions to be able to give you that great experience when protecting your Microsoft environment. Though DPM relies on different components in the operating system, there are three key components that you must be aware of:

  • PowerShell

  • The DPMDB database

  • Volume Shadow Copy Services (VSS)

All the configurations made in DPM regarding deployed agents, throttling, protection groups, and so on are stored in a local or remote SQL Server database called DPMDB. It is very important that you back up the DPMDB database, since it is needed when it comes to restoring your DPM server. The DPMDB database can be placed on a local SQL Server instance that is shipped with your DPM media, or on a remote SQL Server that is already in place in your Microsoft environment.

VSS is the most important component. VSS gives DPM the ability to make snapshots of online and live data that are read during the backup process. In this chapter we will cover:

  • Planning for your DPM deployment

  • The Windows applications

  • The DPM disk pool

  • Dependent services, local accounts, and groups

  • VSS

  • How does the DPM agent operate?

  • A GUI walkthrough

  • Additional functions and roles within DPM

  • PSDataSourceConfig.XML

  • Troubleshooting backups

  • Upgrading scenarios


Planning for your DPM deployment

When it comes to planning your deployment of DPM, there are several scenarios you need to consider: the number of DPM servers you would like to deploy, whether or not to use a backup network, agent deployment, the total size of the DPM disk pool, and so on. First, let's have a look at the hardware requirements.

Hardware requirements

There is a major difference between minimum requirements and recommended requirements, regarding the performance of the DPM server. In the planning phase, you probably have some expectations regarding what performance DPM will have in your environment.

Remember that DPM stores its configurations in SQL Server (DPMDB), and if you are using a local SQL installation, you may consider using a slightly higher amount of RAM than the recommended requirements. Since hardware isn't a big cost or investment for companies these days, you may consider buying hardware that will give DPM the hardware resources it really needs.

Minimum requirements

The minimum hardware requirements are as follows:

  • CPU: 1 GHz dual-core
  • RAM: 4 GB
  • Page file: 0.2 percent of the combined size of all recovery point volumes
  • Disk space: 3 GB for the DPM installation location, 900 MB for the database files drive, and 1 GB for the system drive
  • DPM disk pool: 1.5 times the size of the protected data

Recommended requirements

The recommended hardware requirements are as follows:

  • CPU: 2.33 GHz quad-core
  • RAM: 8 GB
  • Page file: 1.5 times the amount of RAM
  • Disk space: always have at least 3 GB of free disk space on the volume that DPM is installed on
  • DPM disk pool: 1.5 times the size of the protected data
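The guideline ratios above can be turned into a quick sizing sketch. The function below is only illustrative (the input figures are made up); the ratios themselves are the ones from this chapter:

```python
def dpm_sizing(protected_tb: float, ram_gb: float, recovery_point_tb: float) -> dict:
    """Rough DPM sizing figures based on the guideline ratios in this chapter."""
    return {
        # DPM disk pool: 1.5 times the size of the protected data
        "disk_pool_tb": protected_tb * 1.5,
        # Recommended page file: 1.5 times the amount of RAM ...
        "page_file_gb": ram_gb * 1.5,
        # ... plus 0.2 percent of the combined size of all recovery point volumes
        "recovery_point_page_file_gb": recovery_point_tb * 1024 * 0.002,
    }

print(dpm_sizing(protected_tb=10, ram_gb=8, recovery_point_tb=5))
```

For 10 TB of protected data this yields a 15 TB disk pool, which is a useful starting point before adjusting for retention and growth.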

Limitations of DPM

Depending on the load you put on the DPM server, it will be able to protect different numbers of servers. In your DPM deployment, it is important that you are aware of the limitations based on the minimum requirements of DPM.

There are some guidelines you should be aware of. First off, based on the minimum hardware requirements, a DPM server can protect 75 servers and 150 clients. The DPM disk pool can have a total of 600 volumes, of which 300 are replica volumes and 300 are recovery point volumes.

In the disk pool you can have 64 recovery points for file data and 512 online snapshots for other workloads.

Based on the minimum requirements, a DPM server can have 80 TB of disk storage in the disk pool and 40 TB of this is the maximum recovery point size.

DPM is a 64-bit software that can protect both 32-bit and 64-bit operating systems. DPM must be installed on a 64-bit operating system.


The Windows applications

DPM was designed to be fully supported and fully optimized for backup, restore, and disaster recovery scenarios for the Windows workloads. Since DPM follows a predefined definition from the product groups of how backup and restore operations should be performed, this gives you an advantage in restore scenarios compared with other vendors. DPM protects Windows applications that have a defined VSS writer. If these Windows applications are clustered, DPM will be fully aware of the cluster configuration and will also inform you if you haven't installed a DPM agent on all of your cluster members.

The Exchange server

DPM protects the Exchange Windows application with the following Service Pack levels:

  • 2003 SP2

  • 2007

  • 2010

  • 2013

The SQL Server

DPM protects the following versions of the SQL Windows applications:

  • 2000 SP4

  • 2005 SP1

  • 2008

  • 2008 R2

  • 2012


The SharePoint server

DPM protects the following versions of the SharePoint Windows applications:

  • Windows SharePoint Services 3.0

  • Windows SharePoint Services 3.0 SP Search

  • Microsoft Office SharePoint Server 2007

  • SharePoint 2010

  • SharePoint 2013

Virtual platforms

DPM protects the following virtual platforms:

  • Hyper-V 1.0

  • Hyper-V 2.0

  • Hyper-V 3.0

The Windows clients

DPM protects the following Windows clients:

  • Windows XP SP2

  • Vista

  • Windows 7

  • Windows 8

The system state

DPM can protect the system state as a workload (including Active Directory).


The DPM disk pool

Before you can start protecting a production environment, you must attach a disk or disks to the DPM disk pool to be able to perform fast disk recovery.

The choice of disk type or technology is really made easy with DPM. The only important part is that the storage used for the DPM disk pool must be presented as locally attached storage, which means that SAN, NAS, DAS, and local disks will work.

You cannot use USB or IEEE 1394 FireWire disks since they are presented as removable storage in the operating system.

Since the DPM disk pool is based on the disk management and its underlying technologies, there are some limitations that you must be aware of:

  • Master Boot Record (MBR) disks have a 2 TB physical disk limit.

  • Don't make your GPT disk larger than 17 TB even if Microsoft supports it. This is a recommendation from the DPM development group.

  • NTFS supports up to a 16 TB volume size using the default cluster size.

  • The Virtual Disk Service (VDS) supports up to 32-member spanned volumes, which means that you shouldn't use more than 32 disks in the DPM disk pool.

  • Don't exceed 80 TB of storage for production data in the DPM disk pool with a maximum recovery point size of 40 TB.

  • You can have up to 600 volumes in your DPM disk pool.
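The limits above lend themselves to a simple sanity check when planning a pool. The limit values come from this chapter; the function itself is just an illustrative sketch, not a Microsoft tool:

```python
# DPM disk pool limits as listed in this chapter.
LIMITS = {
    "max_spanned_disks": 32,   # VDS spanned-volume member limit
    "max_pool_tb": 80,         # production data in the disk pool
    "max_volumes": 600,        # 300 replica + 300 recovery point volumes
}

def check_pool(disks: int, pool_tb: float, volumes: int) -> list[str]:
    """Return a list of planning problems; an empty list means within limits."""
    problems = []
    if disks > LIMITS["max_spanned_disks"]:
        problems.append(f"{disks} disks exceeds the {LIMITS['max_spanned_disks']}-disk limit")
    if pool_tb > LIMITS["max_pool_tb"]:
        problems.append(f"{pool_tb} TB exceeds the {LIMITS['max_pool_tb']} TB pool limit")
    if volumes > LIMITS["max_volumes"]:
        problems.append(f"{volumes} volumes exceeds the {LIMITS['max_volumes']}-volume limit")
    return problems

print(check_pool(disks=12, pool_tb=40, volumes=200))  # prints [] -> within limits
```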

The RAID levels for the disk pool

When it comes to planning the DPM disk pool, selecting the RAID level is a strategic choice since this will be one area that will give you good or poor performance of the DPM disk pool.

There are four categories that you must consider when planning for the DPM disk pool:

  • Capacity

  • Cost

  • Reliability

  • Performance and scalability

Many companies will use RAID 5 for their RAID level since this gives you an OK score in all four categories. One thing that is often forgotten is the maximum number of disks that can be included in a RAID 5 array before it impacts reliability and performance. This differs among vendors, and you should verify the maximum limits permitted with each storage vendor.

The following matrix will give you a good understanding of the RAID level you should choose to fit your company's performance needs and disk cost. The value 1 in the matrix is poor and 4 is very good.

[Matrix: each RAID level rated from 1 to 4 for capacity, cost, reliability, and performance and scalability.]
If your company would like to have good performance in the DPM disk pool, you should choose RAID 10. This choice isn't the most cost-effective, but it gives you great performance.

Software controllers versus hardware controllers

Regarding the choice of software versus hardware, Microsoft always recommends that you use a hardware controller. DPM will work with a software controller but if you are looking for stability, performance, and reliability for your DPM disk pool, you should always use a hardware controller.

The sector size

When planning your DPM disk pool for an enterprise deployment, there are two critical issues that you must consider:

  • How the data stream is being written

  • The size of the data being written to disk

This is important in those scenarios where you need to plan your SAN being used for the DPM disk pool. DPM will write the data in a sequential I/O with the size of 64 KB.

The custom volumes

DPM 2012 has some auto-heal functions; one of these is automatic growing of volumes, a feature introduced in DPM 2010. In some cases you might like to place your more important or critical protected production data on a storage solution that has better I/O performance for your restore process. As a DPM administrator, the only way to choose which disk in the DPM disk pool will host the protected data is to use custom volumes. Consider the scenario where you would like to place your protected Exchange mailbox databases on a high-performance SAN instead of cheaper storage so you can meet your SLA. A custom volume can also be encrypted.

By using custom volumes, you will be able to manage the creation of the volume for the replica and the volume for the recovery points yourself in disk management. During the creation of a protection group, you can associate the created volumes with the data source you want to protect. Custom volumes will not grow automatically; as an administrator, you need to increase their size yourself when needed.


Deduplication

DPM doesn't do deduplication for the DPM disk pool. It can be done by using third-party software or by using hardware that performs deduplication on the disks that are presented to the DPM server operating system.

For the software deduplication there is one piece of vendor software that you should use. The software name is BitWackr and the vendor is Exar.

For hardware-based deduplication, there are two options. If your SAN supports deduplication for the disks that will be used for the DPM disk pool then you will be able to have the deduplicated data in your disk pool. The second option is to use a product called CRUNCH from the company BridgeSTOR.


Dependent services, local accounts, and groups

After the installation of DPM, you will have some new services running in your operating system and also two specific accounts that you will have been prompted to enter a password for. We will now explain the purpose of these services and the local accounts.


After the installation is finished, the following DPM processes are present in your DPM server's operating system:

  • DPM

  • DPM AccessManager

  • DPM Agent Coordinator

  • DPM CPWrapper

  • DPM Writer




The DPM service

The DPM service is used by the DPM server to implement and manage shadow copy creation and synchronization of your production servers.

The DPM AccessManager service

The DPM AccessManager service will manage access to the DPM server.

The DPM Agent Coordinator service

When you are deploying, updating, or uninstalling the agent, the DPM Agent Coordinator service is the service that manages these processes.

The DPM CPWrapper service

The DPM CPWrapper service is used for the DCOM-WCF bridge service in association with the dpmcmd process. It is used when wrapping the data for certificate-based authentication (CBA) protection.

The DPM Writer service

The DPM Writer service manages the backed up shadow copies of the replicas. The DPM Writer service is also used when you are backing up the local DPMDB or reporting databases.

The DPMLA service

The DPMLA service is used by DPM for managing the libraries attached to the DPM.

The DPMRA service

The DPMRA service is the DPM replication agent and is found on the protected servers and also on the DPM server. The purpose is to back up and restore file and application data to the DPM.

Local accounts and groups

During the installation process of DPM, you will be prompted to type in a password for two accounts that will be placed locally on the DPM server. Both accounts are low-privilege accounts in the operating system. The accounts are as follows:

  • DPMR$YOUR_DPM_SERVER_NAME

  • MICROSOFT$DPM$Acct

The DPMR$YOUR_DPM_SERVER_NAME account is used by the local SQL Server Reporting Services with the purpose of generating reports in the DPM console.

The MICROSOFT$DPM$Acct account is used by the local SQL Server and SQL agent services.

There are also six groups, as follows:

  • DPMDBReaders$your_dpm_server_name: This contains the computer account for your DPM server, so it has the privilege to read information in the DPMDB

  • DPMDRTrustedMachines: This contains the computer account for the secondary DPM server associated with your DPM server

  • DPMRADcomTrustedMachines: This contains the primary and secondary DPM servers' computer accounts

  • DPMRADmTrustedMachines: This contains the computer account that has an associated DPM agent with your DPM server

  • MSDPMTrustedMachines: This contains the computer accounts of those production servers that have an associated DPM agent with the DPM server

  • MSDPMTrustedUsers: This is used for the centralized management features


Volume Shadow Copy Services (VSS)

The VSS is a key feature of the DPM backup and restore processes for your Microsoft production environment. For a few minutes you will get a deep dive into how VSS works and "what makes it tick".

VSS was first introduced in the Windows Server 2003 release and has been developed since. The VSS enables you to make a backup of your production servers while they are still running their production processes.

The VSS consists of four different blocks:

  • The VSS requester: The DPM agent is a requester and the purpose of this is to initiate a request for a snapshot to happen.

  • The VSS writer: SQL, Exchange, SharePoint, and so on all have a defined VSS writer. The VSS writer guarantees that there is a consistent data set for backup.

  • The VSS provider: The VSS provider is software- or hardware-based. The VSS provider creates and maintains the shadow copies. By default, you are using a software provider that resides within the operating system. The software provider uses a copy-on-write technique that will be explained shortly.

  • The VSS service: To make the requester, writer, and provider work together, you will need a coordination service. The VSS service is the coordinator that makes the communication between the different components work.

The creation of a shadow copy

Let's have a look at how the different components of the shadow copy services interact with each other to be able to make a consistent shadow copy of your production environment. The following diagram is a graphical explanation of the process:

The DPM agent sends a query to the VSS to enumerate the writers and the writer metadata within the protected servers' operating system and prepare for the creation of a shadow copy:

  1. The VSS writer creates an XML file that will describe the components and data stores that need to be included in the backup and also a definition of the restore process. The information is transferred to the VSS that will provide the VSS Requestor with the VSS writer's description. The VSS Requestor will select the components for the backup process.

  2. The VSS will receive the VSS Requestor's choice for backup and will instruct the VSS writers to prepare their data for creating a shadow copy.

  3. The VSS writer will complete all open transactions, rolling transaction logs, and flushing caches. When this process is done, the VSS writer notifies the VSS that the data is ready to be shadow copied.

  4. The VSS instructs the VSS writers to freeze their write I/O requests for that specific application. During the freeze state, the shadow copy is created. This takes just a few seconds, but there is a time-out limit of 60 seconds. The shadow copy service will flush the file system buffer and freeze the filesystem. This process records the system metadata and verifies that it is correct and that the data that will be shadow copied is written in a consistent order.

  5. The VSS initiates the provider to create a shadow copy. This takes at most 10 seconds and, during this time, the write I/O is frozen. However, you are still able to read the data being processed.

  6. The VSS releases the file system write I/O.

  7. The VSS tells the application to un-freeze the I/O requests.

  8. If any error occurs then the requester can retry the process.

  9. If the shadow copy creation was successful the VSS returns the location of the files to the VSS Requestor.
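The freeze/thaw ordering in the steps above can be sketched as a toy state machine. The class names mirror the VSS roles; none of this calls the real Windows VSS API, it only illustrates the ordering and the 60-second freeze time-out:

```python
import time

FREEZE_TIMEOUT_S = 60  # VSS aborts the snapshot if the freeze lasts longer

class ToyWriter:
    """Stand-in for a VSS writer (SQL, Exchange, and so on)."""
    def __init__(self):
        self.frozen = False
    def prepare(self):
        pass               # complete open transactions, flush caches
    def freeze(self):
        self.frozen = True   # hold write I/O for the application
    def thaw(self):
        self.frozen = False  # release write I/O again

def create_shadow_copy(writer: ToyWriter) -> bool:
    """Walk through prepare -> freeze -> snapshot -> thaw; True on success."""
    writer.prepare()
    writer.freeze()
    started = time.monotonic()
    try:
        # ... the provider takes the snapshot here (reads are still allowed) ...
        return time.monotonic() - started <= FREEZE_TIMEOUT_S
    finally:
        writer.thaw()  # writes resume whether or not the copy succeeded

w = ToyWriter()
print(create_shadow_copy(w), w.frozen)  # prints: True False
```

The `finally` block mirrors step 7: the application is un-frozen even if the snapshot fails, so production write I/O is never left blocked.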

Different ways of creating a shadow copy

When the VSS coordinates a creation of a shadow copy, there are three different techniques to achieve this:

  • Complete copy: This technique makes a full copy or a clone of a disk

  • Copy-on-write: This is a technique that only copies data that has changed and is used by the DPM

  • Redirect-on-write: When the original volume receives a change, the change is made to another volume that stores the shadow copy storage area
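The copy-on-write technique used by the default software provider can be illustrated with a minimal sketch. Here the "volume" is just a list of blocks, and the snapshot only stores the original contents of blocks that change after the snapshot was taken:

```python
class CowSnapshot:
    """Minimal copy-on-write snapshot over a list of blocks (illustrative only)."""
    def __init__(self, volume: list):
        self.volume = volume
        self.saved = {}  # block index -> original block contents

    def write(self, index: int, data):
        if index not in self.saved:              # first change since the snapshot:
            self.saved[index] = self.volume[index]  # save the old block first
        self.volume[index] = data

    def read_snapshot(self, index: int):
        # Snapshot view = saved original block if it changed, else the live block.
        return self.saved.get(index, self.volume[index])

vol = ["a", "b", "c"]
snap = CowSnapshot(vol)
snap.write(1, "B")
print(vol)                    # prints ['a', 'B', 'c'] -- the live volume changed
print(snap.read_snapshot(1))  # prints b -- the snapshot still sees the old data
```

This is why a copy-on-write snapshot is cheap to create: nothing is copied until a block actually changes.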


How does the DPM agent operate?

The DPM agent is the communication channel between the production server that is protected and the DPM server. There are several important things to know regarding how the DPM agent works and why.

Distributed Component Object Model (DCOM)

Distributed Component Object Model (DCOM) is the technology for the communication between the software components for computers on a network.

DCOM objects that reside within the operating system are located in Administrative Tools | Component Services. If you expand Component Services | Computers | My Computer | DCOM Config you will see all the DCOM objects.

The DCOM object for the DPMRA service is the most significant for the backup and restore operations. Within the security settings for the DPMRA service, you will find the security settings for launching and activation. If you are looking at a production server that is protected by DPM, you will find the computer account for the primary (and secondary) DPM server there. These computer accounts must be granted the following permissions:

  • Local launch

  • Remote launch

  • Local activation

  • Remote activation

Direction of communication

When you are protecting a production server or a Windows client, the communication is initialized in different ways:

  • In a production server scenario, the DPM server initializes the communication

  • In a Windows client scenario, the DPM agent initializes the communication

The firewall settings for DPM

The following is a list of the TCP and UDP ports used by DPM communication. If the firewall is not configured correctly, DPM will not work:

  • DNS: 53/UDP
  • DCOM: 135/TCP, with dynamic allocation in the 1024-65535 range
  • NetBIOS: 137/UDP, 138/UDP, 139/TCP, and 445/TCP
  • The DPM agent communication channel: 5718/TCP and 5719/TCP

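A quick way to verify that a firewall isn't blocking the DPM-specific TCP ports is a simple reachability probe. The server name below is hypothetical, and a successful connect only proves something answers on the port, not that DPM itself is healthy:

```python
import socket

DPM_TCP_PORTS = [135, 5718, 5719]

def probe(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, timed out, or name resolution failed
        return False

for port in DPM_TCP_PORTS:
    print(port, probe("dpm01.contoso.local", port))  # hypothetical DPM server name
```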
Underlying technologies

When DPM is performing its backups and restore operations, there are several underlying technologies that are used to be able to track those block-level changes that are associated with a Windows application or files.

Change Journal

The Change Journal was first introduced in the Windows 2000 Server operating system and has been developed over the years. The Change Journal enables you to keep track of the changes made to files on an NTFS-formatted volume. The Change Journal exists on the volume itself and is stored as a sparse file on each volume present in the operating system.

The File System filter

The File System Filter is a driver that intercepts requests targeted at a filesystem. By doing the interception, the filter driver can extend or replace functionality that is provided by the original target of the request.

The DPM File filter

The DPM File filter is the technology that provides the delta change tracking of a protected volume.


A GUI walkthrough

The first thing you will discover in the new GUI of DPM is that DPM has the same look as the other System Center family applications. The new GUI of DPM enables you to navigate through the product with ease. You now have the ability to work with ribbons and Outlook-style navigation. The console is still based on the Microsoft Management Console (MMC), but this doesn't mean that you can attach to your DPM server console via MMC from other operating systems. If you wish to administer your DPM server remotely, you should use the Remote Administration function.

Let's take a look at the different task areas in the GUI:

The Navigation bar

The DPM console consists of the following five buttons:

  • Monitoring

  • Protection

  • Recovery

  • Reporting

  • Management

The different buttons will provide you with different management tasks or scenarios and we will start off by looking at the Monitoring task area.


Monitoring

Regardless of what is going on in your DPM environment or DPM server, the Monitoring task pane will give you the information you need to see the health of your DPM server. The Monitoring task pane consists of two parts: Alerts and Jobs. They can both be filtered with the new context bar feature that resides at the top of the display pane.


Alerts

There are three types of Alerts that DPM will provide:

  • The Critical alerts are alerts regarding functions or features that have failed their backup or restore process of the production environment.

  • The Warning alerts inform you that something needs your attention.

  • The Information alerts inform you of the result of a restore and so on. This type of alert is just information regarding the result of a successful operation.

As a DPM administrator, you can also inactivate alerts by right-clicking an alert in the console. Inactivating an alert will clear that specific alert from the console. If the error reoccurs, a new alert will be published.

You can also subscribe to alerts via e-mail. If you choose to do this, your DPM server will send your alerts to a specified e-mail address or addresses. To do this, you must first define an SMTP server that DPM has the rights to use and then configure your notifications. Click on the Subscribe icon in your ribbon. If you cannot see the Subscribe icon, remember that the ribbon interacts with your navigation in the console; click on an alert type and the Subscribe icon should appear. When you click on the Subscribe icon, a new window will appear; click on the SMTP tab and fill in the information regarding the SMTP server that should be used for this operation. When you have finished filling in the configuration for your SMTP server, click on the Notifications tab. Now choose the different alert types that you want to have forwarded and fill in the recipient list. To separate multiple e-mail addresses, use a comma.

You can now press the Send Test Notification button and shortly an e-mail will appear in those specified e-mail inboxes.
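By hand, the equivalent of that test notification is a plain SMTP message with comma-separated recipients. The sketch below uses Python's smtplib; the host and addresses are hypothetical placeholders, and DPM performs this internally once the SMTP tab is configured:

```python
import smtplib
from email.message import EmailMessage

def build_test_message(sender: str, recipients: list[str]) -> EmailMessage:
    """Build a notification message; recipients are joined with commas."""
    msg = EmailMessage()
    msg["Subject"] = "DPM test notification"
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)  # multiple addresses, comma separated
    msg.set_content("If you can read this, alert notifications are working.")
    return msg

def send_test_notification(smtp_host: str, sender: str, recipients: list[str]) -> None:
    # Hand the message to the configured SMTP server on port 25.
    with smtplib.SMTP(smtp_host, 25) as smtp:
        smtp.send_message(build_test_message(sender, recipients))

# send_test_notification("smtp.contoso.local", "dpm@contoso.local",
#                        ["admin1@contoso.local", "admin2@contoso.local"])
```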


Jobs

When you schedule a backup within a protection group, the backup schedule will be presented in the DPM console within the standard filters. There are four default filters:

  • All jobs

  • All jobs in progress

  • Failed jobs for yesterday and today

  • Today's jobs

You can also create custom filters, which can consist of different job types and statuses. To create a custom filter, click on Filters in the DPM console and then click on the Create button. A window appears; enter your filter name, set the time interval, and choose which job types and statuses should be used in your custom filter. You can also narrow down the output of the filter by choosing explicit computers or protection groups under the Protection tab. If your custom filter should include information regarding time elapsed or data transferred for your tape-based backup, this is defined under the Other tab.


Protection

In the Protection area, you will need to define your protection groups, which contain the backup schedules for your Microsoft environment.

Facets pane

Outlook Navigation has two parts: Data Source Health and All Protection groups. These two parts are filters that help DPM administrators filter the information regarding the DPM server's protection group and health:


Ribbons enable you to create new protection groups, modify them, and delete them. The ribbon is context sensitive meaning that, depending on your selection, you will get other tasks presented in the ribbon:


You can also perform an optimization at the protection group level by using Enable on-the-wire compression. When you are using this function, the DPM agent will compress the data it backs up before sending it to the DPM server. This will not severely impact the performance of your production environment, but it will consume some CPU and RAM.

If you are looking for which tapes are associated with a specific protection group, just select the protection group and click on the View tapes list. A new window will appear and present you with that information. If you want to perform a consistency check on your data sources, just select the protection group and click on the Consistency check ribbon button:

Resume backups

Resume backups will clear out all the VSS inconsistencies and synchronize the replicas, and can be done for disk, tape, and Azure:

With Online protection, you can configure your DPM server to replicate its data to Azure.

With Self service recovery, you can configure your DPM server to let the DBAs in your organization restore SQL databases to an alternative location without contacting the DPM administrator.

With Tape catalog retention, you specify the tape catalog retention and set an alert limit for the growth of the DPMDB.

In the Protection area you can create a recovery point status report by clicking on the Recovery point status ribbon button.

To download the latest updates available, click on the Check updates ribbon button. To determine your DPM version and applied updates, click on the About DPM ribbon button as seen in the following screenshot:


Recovery

In the Recovery task area you will see what DPM is all about: restore.

In the Navigation pane you will see Browse. In the Filter servers text field you enter a server name that you want to restore data from and hit Enter. DPM will provide you with a list of all the servers and data sources that were found from your search under Recoverable Data.

Under Search in the facets pane you will be able to make your searches for recoverable data within three different categories:

  • Files and folders

  • Exchange mailboxes

  • SharePoint

Within those three search categories you will be able to perform a more detailed search.

On the right side in the display pane you will see all of your recoverable items that were the output of your searches. In the calendar you will see some dates in bold numbers; this is an indication that DPM has recovery points for those dates. After you have chosen your recovery time, you can then restore a recoverable item by right-clicking on the data source and choosing Recover.


Reporting

It is important to know the status of your DPM server and the present recovery point status. By default, DPM is shipped with six standard reports that will provide you with information regarding different areas of the DPM server. The reports are:

  • Disk Utilization: This report provides you with information regarding disk capacity, disk allocation, and disk usage in the DPM storage pool.

  • Recovery: The Recovery report provides you with details about recovery items and statistics of recovery jobs.

  • Recovery Point Status: This report provides you with information on whether a recovery point is present within the defined time window.

  • Status: This report provides the status of all recovery points for a specific time period.

  • Tape Management: This report provides details for managing tape rotation.

  • Tape Utilization: This report provides information on the trends for the tape utilization and capacity planning.

To keep track of changes in your DPM server environment, you are now able to schedule the creation of reports for future comparison. You are also able to subscribe to the reports after you have created them, but this feature needs a defined SMTP server. The reports can be sent in three different formats:

  • HTML

  • Excel

  • PDF


Management

In the Management task area, you will be able to manage your DPM server. In the facets pane you will find three different parts:

  • Agents

  • Disks

  • Libraries

Before DPM can start to protect the server-side production data, a DPM agent must be installed and attached to the DPM server. In the Agents part, you are able to install, update, disable protection, uninstall, throttle, or just refresh your agents by right-clicking the protected server name. You can also install the DPM agents to the production servers from here by clicking on the Install button in the toolbar.

All the information about your DPM storage pool can be found under the disks link in the facets pane. Within the GUI, you are able to add or rescan disks in the disk pool.

Under the Libraries part, you will find the attached tape library or stand-alone tape drive. DPM is not picky about the vendor of your tape solution. The only important consideration is that the tape drives and the library's media changer are populated correctly in Device Manager. It is a good idea to verify the drivers for your tape solution: if you have Microsoft-signed drivers, or the vendor has verified that its drivers work with DPM, you are ready to use your new tape solution.


Additional functions and roles within DPM

In this section we will discuss some additional features and roles within DPM that can ease the administrative burden of the IT staff or helpdesk.

End-user Restore Recovery (EUR)

In the first version of DPM, released in 2006, End-user Restore Recovery (EUR) was introduced. This feature is based on the shadow copy client that is included in client operating systems from Windows Vista onwards. Windows XP can also be protected, but you need to install the shadow copy client and have SP2 installed.

EUR gives end users the ability to restore previous versions of files on file shares. The EUR function authorizes a user restoring a file or folder by checking the Access Control List (ACL) in the NTFS permissions of that file or folder.

To enable the EUR feature, you must update the schema of your Active Directory. Keep in mind that schema changes are not reversible; if you need to undo them, you will have to restore Active Directory.

DPM Self-service Recovery Tool (SSRT)

SQL Server has been around for some time and, with it, the need to restore databases. Many database administrators would like to be able to perform their own database restores without having to contact the restore administrator.

With the Self-Service Recovery Tool (SSRT), DBAs can perform the restore process without contacting any restore administrator.

The DPM administrator needs to specify the Active Directory group or user that will be able to perform a restore, and define where the restored databases should be placed. The DBAs will not be able to restore any databases to their original location; only to alternative locations or network folders.

The configuration of SSRT will be covered in Chapter 6, DPM-aware Windows Workload Protection.

Single Instance Storage (SIS)

Single Instance Storage (SIS) is a feature included in Windows Storage Server 2003 and 2008. SIS is Microsoft's former answer to deduplication of data.

DPM is SIS-aware, which means that DPM can protect SIS-enabled volumes of the Windows Storage Server, which is why the SIS component is installed during the DPM server installation.

DPM cannot leverage SIS by itself to achieve a deduplicated DPM disk pool. If your goal is to have a deduplicated DPM disk pool you can use hardware deduplication of the SAN that stores the DPM disk pool or you can use third-party software such as BitWackr.



The PSDataSourceConfig.XML file

After you have deployed your DPM agents, attached them, and created a protection group with data sources, a configuration file is created on the production server. This file is PSDataSourceConfig.XML; we will take a closer look at it and explain how it is structured.

The PSDataSourceConfig.XML file is the DPM agent's configuration file; it defines which VSS writer should be used for which purpose. For example, when you perform a system state backup of a Windows Server 2008 R2 machine, the PSDataSourceConfig.XML file instructs the DPM agent to use the VSS writer with the WriterId tag 8c3d00f9-3ce9-4563-b373-19837bc2835e.

Let's have a closer look at all the lines in the PSDataSourceConfig.XML file that are associated with the system state backup:
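The exact contents vary between agent and DPM versions; the following sketch only shows the kind of tags you will find for the system state datasource. The element nesting and all values except the system state WriterId are illustrative placeholders, not the literal file contents:

```xml
<!-- Illustrative sketch only: nesting and values (other than the
     system state WriterId) are placeholders, not the exact file -->
<DatasourceConfig>
  <WriterId>8c3d00f9-3ce9-4563-b373-19837bc2835e</WriterId>
  <Version>1.0</Version>
  <VssWriterInvolved>...</VssWriterInvolved>
  <LogicalPath>...</LogicalPath>
  <ComponentName>...</ComponentName>
  <FilesToProtect>...</FilesToProtect>
  <Size>...</Size>
  <UseDRWithCC>false</UseDRWithCC>
</DatasourceConfig>
```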

The tags in this file are used as follows:

  • The WriterId tag identifies the VSS writer that should be used for the VSS process

  • The Version tag of the VSS is the VSS version that is used for the VSS writer

  • The VssWriterInvolved tag identifies any cooperative VSS writers used for the backup process

  • The LogicalPath and ComponentName tags are used by the VSS writer for backup reporting

  • The FilesToProtect tag identifies which files should be included in the system state backup

  • The Size tag in the PSDataSourceConfig.XML file indicates the size that is allocated in KB for the VSS area

  • The UseDRWithCC tag is used for the disaster recovery process
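As a quick illustration of reading these tags programmatically, here is a minimal Python sketch. The XML fragment it parses is hypothetical; only the tag names and the system state WriterId come from the text above:

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment: tag names are those described above, but the
# nesting and the values of FilesToProtect and Size are invented.
doc = """
<DatasourceConfig>
  <WriterId>8c3d00f9-3ce9-4563-b373-19837bc2835e</WriterId>
  <Version>1.0</Version>
  <FilesToProtect>%SystemRoot%\\*</FilesToProtect>
  <Size>1024</Size>
</DatasourceConfig>
"""

root = ET.fromstring(doc)
writer_id = root.findtext("WriterId")   # which VSS writer the agent should use
size_kb = int(root.findtext("Size"))    # allocated VSS area size in KB

print(writer_id, size_kb)  # prints: 8c3d00f9-3ce9-4563-b373-19837bc2835e 1024
```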


Troubleshooting backups

DPM only uses technologies that are already installed in the operating system to back up and restore the production servers. It is good to know where the DPM agent and the DPM server store the logfiles that are written during these processes, in case something goes wrong. Understanding the logfiles is critical if you are facing a backup error that you are unable to resolve.

The local Windows logs

The operating system writes useful troubleshooting information to the local Windows logs about what caused a backup to fail. The Application and System logs provide reasonably detailed information that will aid you in the troubleshooting process.

Troubleshooting VSS

Since DPM relies on the underlying VSS technology, this is the first place you should look for errors. VSS has a few basic requirements, such as enough free disk space on the volumes that the shadow copy operates on and VSS writers in a stable state, which makes for a simple troubleshooting process.

To verify the VSS state, open a command prompt and type vssadmin list writers. The output is a list of all the VSS writers present in the operating system and their state. If the state is 1 (Stable) or 5 (Waiting for completion), everything is normal.
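If you want to script this check against saved vssadmin output, the following minimal Python sketch flags writers that are in neither state 1 nor 5. The sample output is abbreviated; treat the exact line layout as an assumption about your environment:

```python
import re

def unhealthy_writers(vssadmin_output: str) -> list[tuple[str, int]]:
    """Return (writer name, state number) for writers whose state is
    neither 1 (Stable) nor 5 (Waiting for completion)."""
    bad = []
    name = None
    for line in vssadmin_output.splitlines():
        line = line.strip()
        m = re.match(r"Writer name: '(.+)'", line)
        if m:
            name = m.group(1)
        m = re.match(r"State: \[(\d+)\]", line)
        if m and name is not None:
            state = int(m.group(1))
            if state not in (1, 5):
                bad.append((name, state))
    return bad

# Abbreviated sample of vssadmin list writers output
sample = """
Writer name: 'System Writer'
   Writer Id: {e8132975-6f93-4464-a53e-1050253ae220}
   State: [1] Stable
   Last error: No error
Writer name: 'SqlServerWriter'
   Writer Id: {a65faa63-5ea8-4ebc-9dbd-a0c4db26912a}
   State: [8] Failed
   Last error: Non-retryable error
"""

print(unhealthy_writers(sample))  # [('SqlServerWriter', 8)]
```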

The DPM agent logs

If you don't find anything wrong with VSS, free disk space, or the local Windows logs, and you have an indication that the error resides on the production server, you should look into the DPM agent's logs. The logfiles reside in the %ProgramFiles%\Microsoft Data Protection Manager\DPM\Temp folder. The logfiles record all the processes used in the communication between the DPM server and the DPM agents. On the DPM agent side, you will find the following logfiles:

  • AgentBootstrapperCurr

  • DPMACCurr

  • DPMRACurr

The DPM server logs

On the DPM server side, the logfiles reside in the %ProgramFiles%\Microsoft DPM\DPM\Temp folder. The logfiles are as follows:

  • AgentBootstrapperCurr

  • AMServiceActivityCurr

  • AMServiceAudit

  • DPMAccessManagerCurr

  • DpmBackupCurr

  • DPMCLI0Curr

  • DPMCLI9Curr

  • DPMRACurr

  • DPMRoleConfiguration0Curr

  • DPMUI0Curr

  • DpmWriterCurr

  • LAAgentCurr

  • MSDPMCurr

These logfiles are very detailed and describe, in readable form, everything that goes on with the DPM server. Microsoft uses these files to track errors in DPM installations. The logfile to look into first, on both the DPM agent and the DPM server side, is DPMRACurr. This logfile records all the processes of the remote agent and, if the DPMRA service has encountered an error, it will be stated in the logfile as WARNING Failed or Error.
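A tiny script can help when skimming DPMRACurr for these markers. The following Python sketch is illustrative only; the sample log lines are invented, and real DPMRACurr entries carry many more fields:

```python
# Scan DPMRA-style log text for the failure keywords mentioned above.
def suspect_lines(log_text: str, keywords=("WARNING Failed", "Error")) -> list[str]:
    return [line for line in log_text.splitlines()
            if any(k.lower() in line.lower() for k in keywords)]

# Invented sample lines for illustration
sample_log = """\
0FD8 1A2C NORMAL  Starting replica job
0FD8 1A2C WARNING Failed: unable to contact DPM server
0FD8 1A2C NORMAL  Retrying connection
"""

for line in suspect_lines(sample_log):
    print(line)  # prints only the WARNING Failed line
```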


Upgrading scenarios

When you are facing an upgrade scenario, there are some things you must be aware of. Always have a local dump of the DPM database, DPMDB; this is achieved by using the DpmBackup.exe command-line tool. If anything goes wrong with the upgrade, you can always restore your DPM server using the previously dumped DPM database.

Upgrading DPM is very easy: you can upgrade from DPM 2010 to DPM 2012 simply by running the installation from the DPM media and following the wizard.

You cannot upgrade from DPM 2006 or DPM 2007 to DPM 2012. If you would like to benefit from the new features of DPM 2012 you should create a co-existence environment of DPM 2007 and DPM 2012.



Summary

In this chapter, we looked at DPM and its architecture. We covered the important key features and the underlying technologies in the operating system, such as VSS, the change journal, DPMDB, and the file filter, among others. Additionally, we covered the new GUI and looked at some of its new key features.

In the next chapter, we will cover backup strategies and how to design them for use in the real world.

About the Authors

  • Steve Buchanan (MVP)

    Steve Buchanan (MVP), MCSE, ITIL, is a regional solutions director with Concurrency, a five-time Microsoft Cloud and Data Center MVP, and author of several technical books focused on the System Center platform. Steve has been an IT Professional for 17+ years in various positions, ranging from infrastructure architect to IT manager. Steve is focused on digitally transforming IT departments through service management, systems management, and cloud technologies.

    Steve has authored the following books:

    • System Center 2012 Service Manager Unleashed, Sams Publishing
    • Microsoft System Center Data Protection Manager 2012 SP1, Packt
    • Microsoft Data Protection Manager 2010, Packt

    Steve holds the following certifications: A +, Linux +, MCSA, MCITP: Server Administrator, MCSE: Private Cloud, and ITIL 2011 Foundation.

    Steve stays active in the System Center community and enjoys blogging about his adventures in the world of IT at

  • Islam Gomaa

    Islam Gomaa is a System Architect at Kivuto Solutions Inc, the global leader in complex digital distribution solutions. Islam has over 15 years of expertise in helping organizations align their business goals using Microsoft technology and deploying Microsoft-based solutions, which helped Kivuto become ISO 27001 certified and achieve the Microsoft Gold competency as an ISV. Islam is an SCDM MVP and member of the Windows Springboard Technical Expert Panel (STEP) for Windows 8 and Server 2012, having delivered STEP presentations as an evangelist across Canada and the USA. He has also authored select advanced webcasts on Microsoft private cloud. Islam presented at both TechEd 2013 North America and Europe, and is welcomed each year to present for TechEd and MMS as a guest speaker. Islam has a Bachelor’s in computer science from Montreal University, holds several Microsoft technical designations, and is an active member of the IT community. Islam enjoys sharing his adventures and ideas about system administration through his blog at and

  • Robert Hedblom

    Robert Hedblom has a well-known System Center and cloud MVP profile. He is often seen as a presenter or speaker at large global events, such as TechEd, Ignite, System Center, and Azure gatherings. Within his line of work at Lumagate, he delivers scenario-based solutions in complex environments, empowering the Cloud OS strategy from Microsoft.

    Backup, restore, and disaster recovery have always been a part of Robert's work, and now, he is trying to make a point that it is not about the implementation of different products that matters anymore, it's all about building cost-effective, optimal, and well-designed services.

    "All good designs come from knowing your data services' dependencies" is what Robert believes in.

    Robert was a coauthor on Microsoft System Center Data Protection Manager 2012 SP1, Packt Publishing, and has also written other books and given training. His expertise is often utilized by Microsoft's MCS and PFE organizations in Europe, and he is an elected member of the development process of the System Center stack and Azure at Microsoft Redmond.

  • Flemming Riis

    Flemming Riis is an infrastructure consultant at Kompetera with a focus on System Center. He has been working there since 1997 in various roles, starting with repairing PCs and then presales support. He is now a consultant who started with management software, then became Operations Manager, and hasn't looked back since. Flemming is a Microsoft System Center MVP and holds the following certifications: MCP, MCTS, MCSA, and MCITP. Flemming can be found blogging at
