Digging into Windows Azure Diagnostics

Microsoft Windows Azure Development Cookbook


Over 80 advanced recipes for developing scalable services with the Windows Azure platform with this Microsoft Azure book and eBook

by Neil Mackenzie | August 2011 | Cookbooks Enterprise Articles Microsoft

A Windows Azure hosted service may comprise multiple instances of multiple roles. These instances all run in a remote Windows Azure data center, typically 24/7. The ability to monitor these instances non-intrusively is essential both in detecting failure and in capacity planning. This article by Neil Mackenzie, author of Microsoft Windows Azure Development Cookbook, shows how Windows Azure Diagnostics provides for the non-intrusive capture of diagnostic data and its subsequent persistence to the Windows Azure Storage Service. Windows Azure Diagnostics supports various standard diagnostic data sources and allows them to be extended where appropriate. The topics covered in this article include:

  • Using the Windows Azure Diagnostics trace listener
  • Performing an on-demand transfer
  • Implementing custom logging
  • Accessing data persisted to Windows Azure Storage
  • Using the Windows Azure Platform PowerShell cmdlets to configure Windows Azure Diagnostics

 


The reader will benefit from the previous article on Windows Azure Diagnostics: Initializing the Configuration and Using a Configuration File.

Introduction

Diagnostic data can be used to identify problems with a hosted service. The ability to view the data from several sources and across different instances eases the task of identifying a problem.

Diagnostic data can be used to identify when service capacity is either too high or too low for the expected workload. This can guide capacity decisions such as whether to scale up or down the number of instances.

The configuration of Windows Azure Diagnostics is performed at the instance level. The code to do that configuration is at the role level, but the diagnostics configuration for each instance is stored in individual blobs in a container named wad-control-container located in the storage service account configured for Windows Azure Diagnostics.

There is no need for application data and diagnostics data to be located in the same storage service account. Indeed, a best practice from both security and performance perspectives would be to host application data and diagnostic data in separate storage service accounts.

The configuration of Windows Azure Diagnostics is centered on the concept of data buffers with each data buffer representing a specific type of diagnostic information. Some of the data buffers have associated data sources which represent a further refining of the data captured and persisted. For example, the performance counter data buffer has individual data sources for each configured performance counter. Windows Azure Diagnostics supports record-based data buffers that are persisted to Windows Azure tables and file-based data buffers that are persisted to Windows Azure blobs. In the Accessing data persisted to Windows Azure Storage recipe we see that we can access the diagnostic data in the same way we access other data in Windows Azure storage.

Windows Azure Diagnostics supports the following record-based data buffers:

  • Windows Azure basic logs
  • Performance counters
  • Windows Event Logs
  • Windows Azure Diagnostic infrastructure logs

The Windows Azure basic logs data buffer captures information written to a Windows Azure trace listener. In the Using the Windows Azure Diagnostics trace listener recipe, we see how to configure and use the basic logs data buffer. The performance counters data buffer captures the data of any configured performance counters. The Windows Event Logs data buffer captures the events from any configured Windows Event Log. The Windows Azure Diagnostic infrastructure logs data buffer captures diagnostic data produced by the Windows Azure Diagnostics process.

Windows Azure Diagnostics supports the following file-based data sources for the Directories data buffer:

  • IIS logs
  • IIS Failed Request Logs
  • Crash dumps
  • Custom directories

The Directories data buffer copies new files in a specified directory to blobs in a specified container in the Windows Azure Blob Service. The data captured by IIS Logs, IIS Failed Request Logs, and crash dumps is self-evident. With the custom directories data source, Windows Azure Diagnostics supports the association of any directory on the instance with a specified container in Windows Azure storage. This allows for the coherent integration of third-party logs into Windows Azure Diagnostics. We see how to do this in the Implementing custom logging recipe.

The implementation of Windows Azure Diagnostics was changed in Windows Azure SDK v1.3 and it is now one of the pluggable modules that have to be explicitly imported into a role in the service definition file. As Windows Azure Diagnostics persists both its configuration and data to Windows Azure storage, it is necessary to specify a storage service account for diagnostics in the service configuration file.
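The two pieces of configuration might look like the following sketch, in which the role name WorkerRole1 and the placeholder account values are illustrative:

```xml
<!-- ServiceDefinition.csdef: import the Diagnostics module into the role -->
<WorkerRole name="WorkerRole1">
  <Imports>
    <Import moduleName="Diagnostics" />
  </Imports>
</WorkerRole>

<!-- ServiceConfiguration.cscfg: storage account used by Windows Azure Diagnostics -->
<Role name="WorkerRole1">
  <ConfigurationSettings>
    <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
        value="DefaultEndpointsProtocol=https;AccountName={ACCOUNT_NAME};AccountKey={ACCESS_KEY}" />
  </ConfigurationSettings>
</Role>
```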

The default configuration for Windows Azure Diagnostics captures some data but does not persist it. Consequently, the diagnostics configuration should be modified at role startup. In the Initializing the configuration of Windows Azure Diagnostics recipe, we see how to do this programmatically, which is the normal way to do it. In the Using a configuration file with Windows Azure Diagnostics recipe, we see how to use a configuration file to do this, which is necessary in a VM role.

In normal use, diagnostics data is captured all the time and is then persisted to the storage service according to some schedule. In the event of a problem, it may be necessary to persist diagnostics data before the next scheduled transfer time. We see how to do this in the Performing an on-demand transfer recipe.

Both Microsoft and Cerebrata have released PowerShell cmdlets that facilitate the remote administration of Windows Azure Diagnostics. We see how to do this in the Using the Windows Azure Platform PowerShell cmdlets to configure Windows Azure Diagnostics recipe.

There are times, especially early in the development process, when non-intrusive diagnostics monitoring is not sufficient. In the Using IntelliTrace to Diagnose Problems with a Hosted Service recipe, we see the benefits of intrusive monitoring of a Windows Azure role instance.

Using the Windows Azure Diagnostics trace listener

Windows Azure Diagnostics supports the use of Trace to log messages. The Windows Azure SDK provides the DiagnosticMonitorTraceListener trace listener to capture the messages. The Windows Azure Diagnostics basic logs data buffer is used to configure their persistence to the Windows Azure Table Service.

The trace listener must be added to the Listeners collection for the Windows Azure hosted service. This is typically done through configuration in the appropriate app.config or web.config file, but it can also be done in code. When it creates a worker or web role, the Windows Azure tooling for Visual Studio adds the DiagnosticMonitorTraceListener to the list of trace listeners specified in the Configuration section of the relevant configuration file.

Methods of the System.Diagnostics.Trace class can be used to write error, warning and informational messages. When persisting the messages to the storage service, the Diagnostics Agent can filter the messages if a LogLevel filter is configured for the BasicLogsBufferConfiguration.

The Compute Emulator in the development environment adds an additional trace listener, so that trace messages can be displayed in the Compute Emulator UI.

In this recipe, we will learn how to trace messages using the Windows Azure trace listener.

How to do it...

We are going to see how to use the trace listener provided in the Windows Azure SDK to trace messages and persist them to the storage service. We do this as follows:

  1. Ensure that the DiagnosticMonitorTraceListener has been added to the appropriate configuration file: app.config for a worker role and web.config for a web role.
  2. If necessary, add the following to the Configuration section of app.config or web.config file:

    <system.diagnostics>
      <trace>
        <listeners>
          <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
              name="AzureDiagnostics">
            <filter type="" />
          </add>
        </listeners>
      </trace>
    </system.diagnostics>

  3. Use the following to write an informational message:

    System.Diagnostics.Trace.TraceInformation("Information");

  4. Use the following to write a warning message:

    System.Diagnostics.Trace.TraceWarning("Warning");

  5. Use the following to write an error message:

    System.Diagnostics.Trace.TraceError("Error");

  6. Ensure that the DiagnosticMonitorConfiguration.Logs property is configured with an appropriate ScheduledTransferPeriod and ScheduledTransferLogLevelFilter when DiagnosticMonitor.Start() is invoked.

How it works...

In steps 1 and 2, we ensure that the DiagnosticMonitorTraceListener is added to the collection of trace listeners for the web role or worker role.

In steps 3 through 5, we see how to write messages to the trace listener.

In step 6, we ensure that the Diagnostic Agent has been configured to persist the messages to the storage service. Note that they can also be persisted through an on-demand transfer. This configuration is described in the recipe Initializing the configuration of Windows Azure Diagnostics.

There's more...

The Windows Azure SDK v1.3 introduced full IIS in place of the hosted web core used previously for web roles. With full IIS, the web role entry point and IIS are hosted in separate processes. Consequently, the trace listener must be configured separately for each process. The configuration using web.config configures the trace listener for IIS, not the web role entry point. Note that Windows Azure Diagnostics needs to be configured only once in each role, even though the trace listener is configured separately in both the web role entry point and in IIS.

The web role entry point runs under a process named WaIISHost.exe. Consequently, one solution is to create a special configuration file for this process named WaIISHost.exe.config and add the trace listener configuration to it.

A more convenient solution is to add the DiagnosticMonitorTraceListener trace listener programmatically to the list of trace listeners for the web role entry point. The following demonstrates an overridden OnStart() method in a web role entry point modified to add the trace listener and write an informational message:

public override bool OnStart()
{
    System.Diagnostics.Trace.Listeners.Add(
        new Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener());
    System.Diagnostics.Trace.AutoFlush = true;
    System.Diagnostics.Trace.TraceInformation("Information");

    return base.OnStart();
}

The AutoFlush property is set to true to indicate that messages should be flushed through the trace listener as soon as they are written.

Performing an on-demand transfer

The Windows Azure Diagnostics configuration file specifies a schedule in which the various data buffers are persisted to the Windows Azure Storage Service. The on-demand transfer capability in Windows Azure Diagnostics allows a transfer to be requested outside this schedule. This is useful if a problem occurs with an instance and it becomes necessary to look at the captured logs before the next scheduled transfer.

An on-demand transfer is requested for a specific data buffer in a specific instance. This request is inserted into the diagnostics configuration for the instance stored in a blob in wad-control-container. This is an asynchronous operation whose completion is indicated by the insertion of a message in a specified notification queue. The on-demand transfer is configured using an OnDemandTransferOptions instance that specifies the DateTime range for the transfer, a LogLevelFilter that filters the data to be transferred, and the name of the notification queue. The RoleInstanceDiagnosticManager.BeginOnDemandTransfer() method is used to request the on-demand transfer with the configured options for the specified data buffer.

Following the completion of an on-demand transfer, the request must be removed from the diagnostics configuration for the instance by using the RoleInstanceDiagnosticManager.EndOnDemandTransfer() method. The completion message in the notification queue should also be removed. The GetActiveTransfers() and CancelOnDemandTransfers() methods of the RoleInstanceDiagnosticManager class can be used to enumerate and cancel active on-demand transfers. Note that it is not possible to modify the diagnostics configuration for the instance if there is a current request for an on-demand transfer, even if the transfer has completed.

Note that requesting an on-demand transfer does not require a direct connection with the hosted service. The request merely modifies the diagnostic configuration for the instance. This change is then picked up when the Diagnostic Agent on the instance next polls the diagnostic configuration for the instance. The default value for this polling interval is 1 minute. This means that a request for an on-demand transfer needs to be authenticated only against the storage service account containing the diagnostic configuration for the hosted service.

In this recipe, we will learn how to request an on-demand transfer and clean up after it completes.

How to do it...

We are going to see how to request an on-demand transfer and clean up after it completes. We do this as follows:

  1. Use Visual Studio to create a WPF project.
  2. Add the following assembly references to the project:

    Microsoft.WindowsAzure.Diagnostics.dll
    Microsoft.WindowsAzure.ServiceRuntime.dll
    Microsoft.WindowsAzure.StorageClient.dll
    System.Configuration.dll

  3. Add a class named OnDemandTransferExample to the project.
  4. Add the following using statements to the class:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.Diagnostics;
    using Microsoft.WindowsAzure.Diagnostics.Management;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;
    using System.Configuration;

  5. Add the following private member to the class:

    String wadNotificationQueueName = "wad-transfer-queue";

  6. Add the following method, requesting an on-demand transfer, to the class:

    public void RequestOnDemandTransfer(
        String deploymentId, String roleName, String roleInstanceId)
    {
        CloudStorageAccount cloudStorageAccount =
            CloudStorageAccount.Parse(
                ConfigurationManager.AppSettings["DiagnosticsConnectionString"]);

        OnDemandTransferOptions onDemandTransferOptions =
            new OnDemandTransferOptions()
            {
                From = DateTime.UtcNow.AddHours(-1),
                To = DateTime.UtcNow,
                LogLevelFilter =
                    Microsoft.WindowsAzure.Diagnostics.LogLevel.Verbose,
                NotificationQueueName = wadNotificationQueueName
            };

        RoleInstanceDiagnosticManager ridm =
            cloudStorageAccount.CreateRoleInstanceDiagnosticManager(
                deploymentId, roleName, roleInstanceId);

        IDictionary<DataBufferName, OnDemandTransferInfo> activeTransfers =
            ridm.GetActiveTransfers();
        if (activeTransfers.Count == 0)
        {
            Guid onDemandTransferId = ridm.BeginOnDemandTransfer(
                DataBufferName.PerformanceCounters,
                onDemandTransferOptions);
        }
    }

  7. Add the following method, cleaning up after an on-demand transfer, to the class:

    public void CleanupOnDemandTransfers()
    {
        CloudStorageAccount cloudStorageAccount =
            CloudStorageAccount.Parse(
                ConfigurationManager.AppSettings["DiagnosticsConnectionString"]);
        CloudQueueClient cloudQueueClient =
            cloudStorageAccount.CreateCloudQueueClient();

        CloudQueue cloudQueue =
            cloudQueueClient.GetQueueReference(wadNotificationQueueName);
        CloudQueueMessage cloudQueueMessage;
        while ((cloudQueueMessage = cloudQueue.GetMessage()) != null)
        {
            OnDemandTransferInfo onDemandTransferInfo =
                OnDemandTransferInfo.FromQueueMessage(cloudQueueMessage);
            String deploymentId = onDemandTransferInfo.DeploymentId;
            String roleName = onDemandTransferInfo.RoleName;
            String roleInstanceId = onDemandTransferInfo.RoleInstanceId;
            Guid requestId = onDemandTransferInfo.RequestId;

            RoleInstanceDiagnosticManager ridm =
                cloudStorageAccount.CreateRoleInstanceDiagnosticManager(
                    deploymentId, roleName, roleInstanceId);
            Boolean result = ridm.EndOnDemandTransfer(requestId);
            cloudQueue.DeleteMessage(cloudQueueMessage);
        }
    }

  8. Add the following Grid declaration to the Window element of MainWindow.xaml:

    <Grid>
      <Label Content="DeploymentId:" Height="28"
          HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="30,60,0,0" Name="label1" />
      <Label Content="Role name:" Height="28"
          HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="30,110,0,0" Name="label2" />
      <Label Content="Instance Id:" Height="28"
          HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="30,160,0,0" Name="label3" />
      <TextBox HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="120,60,0,0" Name="DeploymentId" Height="23"
          Width="120" Text="24447326eed3475ca58d01c223efb778" />
      <TextBox HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="120,110,0,0" Width="120" Name="RoleName"
          Text="WebRole1" />
      <TextBox Height="23" HorizontalAlignment="Left"
          VerticalAlignment="Top" Margin="120,160,0,0" Width="120"
          Name="InstanceId" Text="WebRole1_IN_0" />
      <Button Content="Request On-Demand Transfer" Height="23"
          HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="60,220,0,0" Width="175" Name="RequestTransfer"
          Click="RequestTransfer_Click" />
      <Button Content="Cleanup On-Demand Transfers" Height="23"
          HorizontalAlignment="Left" VerticalAlignment="Top"
          Margin="300,220,0,0" Width="175" Name="CleanupTransfers"
          Click="CleanupTransfers_Click" />
    </Grid>

  9. Add the following event handler to MainWindow.xaml.cs:

    private void RequestTransfer_Click(object sender, RoutedEventArgs e)
    {
        String deploymentId = DeploymentId.Text;
        String roleName = RoleName.Text;
        String roleInstanceId = InstanceId.Text;

        OnDemandTransferExample example = new OnDemandTransferExample();
        example.RequestOnDemandTransfer(
            deploymentId, roleName, roleInstanceId);
    }

  10. Add the following event handler to MainWindow.xaml.cs:

    private void CleanupTransfers_Click(object sender, RoutedEventArgs e)
    {
        OnDemandTransferExample example = new OnDemandTransferExample();
        example.CleanupOnDemandTransfers();
    }

  11. Add the following to the configuration element of app.config:

    <appSettings>
      <add key="DiagnosticsConnectionString"
          value="DefaultEndpointsProtocol=https;AccountName={ACCOUNT_NAME};AccountKey={ACCESS_KEY}"/>
    </appSettings>

How it works...

We create a WPF project in step 1 and add the required assembly references in step 2.

We set up the OnDemandTransferExample class in steps 3 and 4. We add a private member to hold the name of the Windows Azure Diagnostics notification queue in step 5.

In step 6, we add a method requesting an on-demand transfer. We create an OnDemandTransferOptions object configuring an on-demand transfer for data captured in the last hour. We provide the name of the notification queue into which Windows Azure Diagnostics inserts a message indicating the completion of the transfer. We use the deployment information captured in the UI to create a RoleInstanceDiagnosticManager instance. If there are no active on-demand transfers, then we request an on-demand transfer for the performance counters data buffer.

In step 7, we add a method cleaning up after an on-demand transfer. We create a CloudStorageAccount object that we use to create the CloudQueueClient object with which we access the notification queue. We then retrieve the transfer-completion messages in the notification queue. For each transfer-completion message found, we create an OnDemandTransferInfo object describing the DeploymentId, RoleName, RoleInstanceId, and RequestId of a completed on-demand transfer. We use the RequestId to end the transfer and remove it from the diagnostics configuration for the instance, allowing further on-demand transfers to be requested. Finally, we remove the notification message from the notification queue.

In step 8, we add the UI used to capture the deployment ID, role name, and instance ID used to request the on-demand transfer. We can get this information from the Windows Azure Portal or the Compute Emulator UI. This information is not needed for cleaning up on-demand transfers, which uses the transfer-completion messages in the notification queue.

In steps 9 and 10, we add the event handlers for the Request On-Demand Transfer and Cleanup On-Demand Transfers buttons in the UI. These methods forward the requests to the methods we added in steps 6 and 7.

In step 11, we add the DiagnosticsConnectionString to the app.config file. This contains the connection string used to interact with the Windows Azure Diagnostics configuration. We must replace {ACCOUNT_NAME} and {ACCESS_KEY} with the storage service account name and access key for the storage account in which the Windows Azure Diagnostics configuration is located.


Implementing custom logging

Windows Azure Diagnostics can be used to persist third-party log files in the same way it persists IIS logs, failed request logs, and crash dumps. These are all configured through the directories data buffer.

The directories data buffer comprises a set of data sources. These are instances of the DirectoryConfiguration class that exposes the following properties:

  • Container
  • DirectoryQuotaInMB
  • Path

A data source maps a path in the local file system to a container in the Windows Azure Blob Service. The DirectoryQuotaInMB property reserves a specified amount of space in local storage for the specified data source. A scheduled transfer period is specified at the level of the directories data buffer. The Diagnostics Agent persists to the configured container any files added to the specified path since the last transfer.

In this recipe, we will learn how to use Windows Azure Diagnostics to persist custom logs to the Windows Azure Blob Service.

How to do it...

We are going to see how to configure custom logging with Windows Azure Diagnostics. We do this as follows:

  1. Use Visual Studio to create an empty cloud project.
  2. Add a worker role to the project (accept default name of WorkerRole1).
  3. Add the following local storage definition, as a child of the WorkerRole element, to ServiceDefinition.csdef:

    <LocalResources>
      <LocalStorage name="CustomLoggingLocation"
          sizeInMB="100" cleanOnRoleRecycle="false"/>
    </LocalResources>

  4. Add the following using statement to WorkerRole.cs:

    using System.IO;

  5. Add the following private member to the WorkerRole class:

    private String localResourceName = "CustomLoggingLocation";

  6. In the WorkerRole class, replace the Run() method with the following:

    public override void Run()
    {
        Trace.WriteLine("WorkerRole1 entry point called", "Information");

        while (true)
        {
            Thread.Sleep(10000);
            Trace.WriteLine("Working", "Information");

            CreateLogFile();
        }
    }

  7. In the WorkerRole class, replace OnStart() with the following:

    public override bool OnStart()
    {
        ServicePointManager.DefaultConnectionLimit = 12;

        InitializeWadConfiguration();

        return base.OnStart();
    }

  8. Add the following method, configuring Windows Azure Diagnostics, to the WorkerRole class:

    private void InitializeWadConfiguration()
    {
        String wadConnectionString =
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";
        String customContainerName = "wad-custom-container";

        DiagnosticMonitorConfiguration dmc =
            DiagnosticMonitor.GetDefaultInitialConfiguration();

        LocalResource localResource =
            RoleEnvironment.GetLocalResource(localResourceName);
        String logPath = Path.Combine(localResource.RootPath, "Logs");

        DirectoryConfiguration directoryConfiguration =
            new DirectoryConfiguration()
            {
                Container = customContainerName,
                DirectoryQuotaInMB = localResource.MaximumSizeInMegabytes,
                Path = logPath
            };
        dmc.Directories.DataSources.Add(directoryConfiguration);
        dmc.Directories.ScheduledTransferPeriod = TimeSpan.FromHours(1);

        dmc.Logs.BufferQuotaInMB = 100;
        dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromHours(1);
        dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

        DiagnosticMonitor.Start(wadConnectionString, dmc);
    }

  9. Add the following method, writing a file in the custom logging directory, to the WorkerRole class:

    private void CreateLogFile()
    {
        LocalResource localResource =
            RoleEnvironment.GetLocalResource(localResourceName);
        String logPath = Path.Combine(localResource.RootPath, "Logs");
        String fileName = Path.Combine(logPath, Path.GetRandomFileName());

        if (!Directory.Exists(logPath))
        {
            Directory.CreateDirectory(logPath);
        }

        using (StreamWriter streamWriter = new StreamWriter(fileName))
        {
            streamWriter.Write("If we shadows have offended");
        }
    }

How it works...

In steps 1 and 2, we create a cloud project with a worker role.

In step 3, we add the definition of the local storage, used for the custom log files, to the service definition file for the hosted service. We provide a name by which it can be referenced and a size. We also specify that the content of local storage should be preserved through an instance recycle.

In step 4, we add the using statement required for file handling. In step 5, we add a private member to store the name of the local resource. In step 6, we replace the existing Run() method with one that creates a log file every 10 seconds to simulate actual logging. In step 7, we replace the OnStart() method with one that configures Windows Azure Diagnostics.

In step 8, we configure Windows Azure Diagnostics to support custom logging. We create the full path to the directory where we store the logs—the Logs directory under the RootPath for the local storage resource. Then, we create and configure the DirectoryConfiguration data source we use to map the log directory to the container in the Blob service into which the Diagnostics Agent transfers the files as blobs. We then add the data source to the Directories data buffer and specify a scheduled transfer period. We also configure the Logs data buffer. Finally, we invoke Start() to update the configuration of Windows Azure Diagnostics.

In step 9, we add a simple method that writes a randomly named file to the custom logging directory.

Accessing data persisted to Windows Azure Storage

Windows Azure Diagnostics captures diagnostic information for an instance and then persists it to the Windows Azure Storage Service. The persistence location depends on the data buffer.

The Directories data buffer is configured as a set of data sources, each of which maps a path on the local file system to a container in the Windows Azure Blob Service. The Diagnostics Agent persists files in that path as blobs in the configured container. Note that the Diagnostics Agent inserts a record in the WADDirectoriesTable each time it persists a file to a container.

The Diagnostics Agent persists the data in the other data buffers as records in the Windows Azure Table Service. The following tables are used:

  • Windows Azure basic logs: WADLogsTable
  • Performance counters: WADPerformanceCountersTable
  • Windows Event Logs: WADWindowsEventLogsTable
  • Windows Azure Diagnostic infrastructure logs: WADDiagnosticInfrastructureLogsTable

The tables are partitioned by minute. Specifically, when a record is inserted in a table, the PartitionKey is set to the Tick count of the current UTC time with the seconds discarded, with the entire value prepended by a 0. Discarding the seconds has the effect of setting the last 8 characters of the PartitionKey to 0. The RowKey combines the deployment ID, the role name, and the instance ID along with a key to ensure uniqueness. The Timestamp represents the time the event was inserted in the table.
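To illustrate, the PartitionKey for a given UTC time can be derived as follows. This is a sketch: the helper name is ours, and it simply follows the construction just described.

```csharp
// Illustrative helper (not part of the SDK): build the WAD-style
// PartitionKey for a UTC time by discarding the seconds and smaller
// units, taking the Ticks, and prepending "0".
static String ToWadPartitionKey(DateTime utcTime)
{
    DateTime roundedToMinute = new DateTime(
        utcTime.Year, utcTime.Month, utcTime.Day,
        utcTime.Hour, utcTime.Minute, 0, DateTimeKind.Utc);
    return String.Format("0{0}", roundedToMinute.Ticks);
}
```

Because Ticks increase monotonically, two such keys can bound a time range in a query against the PartitionKey.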

While each table contains some properties specific to the data being logged, all of them contain the following properties:

  • EventTickCount
  • DeploymentId
  • Role
  • RoleInstance

The EventTickCount is an Int64 representing the time the event was generated, to an accuracy of 100 nanoseconds. The DeploymentId identifies the specific deployment, while the Role and RoleInstance specify the role instance which generated the event.

The WADPerformanceCountersTable, for example, contains the following additional properties:

  • CounterName
  • CounterValue

The Windows Azure Diagnostics tables can be queried just like any other table in the Table service. A model class derived from TableServiceEntity can be created from the properties of interest. As the only index on a table is on PartitionKey and RowKey, it is important that the PartitionKey, rather than the Timestamp or EventTickCount, be used for time-dependent queries. An appropriate value for the PartitionKey can be created from a DateTime. Unless strict equality is desired, it is not necessary to mimic the construction of the PartitionKey by setting the last eight characters to 0.

In this recipe, we will learn how to query data that Windows Azure Diagnostics has persisted to the Table service.

How to do it...

We are going to see how to query performance counter data that Windows Azure Diagnostics persisted in the WADPerformanceCountersTable table. We do this as follows:

  1. Use Visual Studio to create an empty cloud project.
  2. Add an ASP.NET web role to the project (accept default name of WebRole1).
  3. Add the following assembly reference to the project:

    System.Data.Services.Client

  4. In the WebRole class, replace OnStart() with the following:

    public override bool OnStart()
    {
        ConfigureDiagnostics();

        return base.OnStart();
    }

  5. Add the following method, configuring Windows Azure Diagnostics, to the WebRole class:

    private void ConfigureDiagnostics()
    {
        String wadConnectionString =
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";

        CloudStorageAccount cloudStorageAccount =
            CloudStorageAccount.Parse(
                RoleEnvironment.GetConfigurationSettingValue(
                    wadConnectionString));

        DiagnosticMonitorConfiguration dmc =
            DiagnosticMonitor.GetDefaultInitialConfiguration();

        PerformanceCounterConfiguration pmc =
            new PerformanceCounterConfiguration()
            {
                CounterSpecifier = @"\Processor(_Total)\% Processor Time",
                SampleRate = TimeSpan.FromSeconds(10)
            };
        dmc.PerformanceCounters.DataSources.Add(pmc);
        dmc.PerformanceCounters.BufferQuotaInMB = 100;
        dmc.PerformanceCounters.ScheduledTransferPeriod =
            TimeSpan.FromMinutes(1);

        DiagnosticMonitor.Start(cloudStorageAccount, dmc);
    }

  6. In the Default.aspx file, replace the asp:Content element named BodyContent with the following:

    <asp:Content ID="BodyContent" runat="server"
        ContentPlaceHolderID="MainContent">
        <asp:GridView ID="GridView1" runat="server"
            AutoGenerateColumns="false">
            <Columns>
                <asp:BoundField DataField="RoleInstance"
                    HeaderText="Role Instance" />
                <asp:BoundField DataField="CounterName"
                    HeaderText="Counter Name" />
                <asp:BoundField DataField="CounterValue"
                    HeaderText="Counter Value" />
                <asp:BoundField DataField="EventDateTime"
                    HeaderText="Event DateTime" />
            </Columns>
        </asp:GridView>
    </asp:Content>

  7. Add the following using statements to the Default.aspx.cs file:

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;

  8. In the Default.aspx.cs file, add the following private member to the _Default class:

    private String wadConnectionString =
        "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString";

  9. In the Default.aspx.cs file, replace Page_Load() with the following:

    protected void Page_Load(object sender, EventArgs e)
    {
        CloudStorageAccount cloudStorageAccount =
            CloudStorageAccount.Parse(
                RoleEnvironment.GetConfigurationSettingValue(
                    wadConnectionString));
        CloudTableClient cloudTableClient =
            cloudStorageAccount.CreateCloudTableClient();

        DateTime now = DateTime.UtcNow;
        DateTime fiveMinutesAgo = now.AddMinutes(-5);
        String partitionKeyNow =
            String.Format("0{0}", now.Ticks.ToString());
        String partitionKey5MinutesAgo =
            String.Format("0{0}", fiveMinutesAgo.Ticks.ToString());

        TableServiceContext tableServiceContext =
            cloudTableClient.GetDataServiceContext();
        CloudTableQuery<WadPerformanceCountersTable> cloudTableQuery =
            (from entity in tableServiceContext
                 .CreateQuery<WadPerformanceCountersTable>(
                     WadPerformanceCountersTable.Name)
             where entity.PartitionKey.CompareTo(partitionKeyNow) < 0
                 && entity.PartitionKey.CompareTo(
                     partitionKey5MinutesAgo) > 0
             select entity).AsTableServiceQuery();

        GridView1.DataSource = cloudTableQuery;
        GridView1.DataBind();
    }

  10. Add a model class to the project and name it WadPerformanceCountersTable.
  11. Replace the declaration of the WadPerformanceCountersTable class with the following:

    public class WadPerformanceCountersTable
    {
        public static String Name = "WADPerformanceCountersTable";

        public String PartitionKey { get; set; }
        public String RowKey { get; set; }
        public Int64 EventTickCount { get; set; }
        public String DeploymentId { get; set; }
        public String Role { get; set; }
        public String RoleInstance { get; set; }
        public String CounterName { get; set; }
        public Double CounterValue { get; set; }

        public DateTime EventDateTime
        {
            get { return new DateTime(EventTickCount); }
        }
    }

  12. Build the project and run it in the Windows Azure Compute Emulator.
  13. Refresh the Default.aspx page several times over the next few minutes until the page displays some performance counter data.

How it works...

In steps 1 and 2, we create a cloud project with a web role. We add the required assembly reference in step 3.

In steps 4 and 5, we configure Windows Azure Diagnostics. Specifically, we create a PerformanceCounterConfiguration instance for a performance counter capturing total processor time, sampled every 10 seconds. We add the PerformanceCounterConfiguration to the Windows Azure Diagnostics PerformanceCounters data buffer, specifying that the buffer should be persisted to the storage service once a minute. Finally, we start the diagnostic monitor with this configuration.
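The same pattern extends to other counters. As a sketch, a second data source could be added inside ConfigureDiagnostics() before DiagnosticMonitor.Start() is invoked; \Memory\Available MBytes is a standard Windows counter, though its availability on the instance should be verified:

```csharp
// Sketch: capture a second performance counter in the same data buffer.
// Assumes the dmc variable from ConfigureDiagnostics() in step 5.
PerformanceCounterConfiguration memoryCounter =
    new PerformanceCounterConfiguration()
    {
        CounterSpecifier = @"\Memory\Available MBytes",
        SampleRate = TimeSpan.FromSeconds(10)
    };
dmc.PerformanceCounters.DataSources.Add(memoryCounter);
```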

In step 6, we add a GridView to contain the performance counter data we retrieve from the Windows Azure Diagnostics tables in the Windows Azure Table Service.

In steps 7 through 9, we modify Default.aspx.cs to retrieve the data we display in the GridView. In step 7, we add the required using statements. In step 8, we add a private member to store the name of the connection string setting in the service configuration file, ServiceConfiguration.cscfg. In step 9, we initialize the CloudStorageAccount and CloudTableClient objects we use to access the storage service, and create two String objects holding the PartitionKey representation of the current UTC time and of five minutes earlier. Note that we prepend a 0 to each String, as required by the format used in the Windows Azure Diagnostics tables. We then create a CloudTableQuery object that retrieves the last five minutes of data from the WADPerformanceCountersTable table, set the query as the DataSource of the GridView, and bind the data to the GridView.

In steps 10 and 11, we create the model class we use to contain the results of querying the WADPerformanceCountersTable table, in which Windows Azure Diagnostics persists performance counter data.

In steps 12 and 13, we run the web role and wait for Windows Azure Diagnostics to persist data to the Storage Emulator. This data is then displayed in the Default.aspx web page.


Using the Windows Azure Platform PowerShell cmdlets to configure Windows Azure Diagnostics

The Windows Azure team has created a set of PowerShell cmdlets (the Windows Azure Platform PowerShell cmdlets) that can be used to manage Windows Azure hosted services and storage. These cmdlets primarily wrap the Windows Azure Service Management REST API. However, some of the cmdlets use the Windows Azure Diagnostics API to support the remote configuration of Windows Azure Diagnostics.

The cmdlets provide get and set operations for each of the diagnostics data buffers. They also provide operations to request and end an on-demand transfer.

In this recipe, we will learn how to use the Windows Azure Platform PowerShell cmdlets to configure Windows Azure Diagnostics.

Getting ready

If necessary, we can download PowerShell 2 from the Microsoft download center at the following URL:

http://www.microsoft.com/download/en/details.aspx?id=11829

We need to download and install the Windows Azure Platform PowerShell Cmdlets. The package with the cmdlets can be downloaded from the following URL:

http://wappowershell.codeplex.com/

Once the package has been downloaded, the cmdlets need to be built and installed. The installed package contains a StartHere file explaining the process.

How to do it...

We are going to modify the configuration of the Windows Azure basic logs data buffer. We do this as follows:

  1. Create a PowerShell script named Set-WindowsAzureBasicLogs.ps1 and insert the following text:

    $account = "{STORAGE_ACCOUNT}"
    $key = "{STORAGE_ACCOUNT_KEY}"
    $deploymentId = "{DEPLOYMENT_ID}"
    $roleName = "{WEB_ROLE_NAME}"
    $bufferQuotaInMB = 100
    $transferPeriod = 60
    $logLevelFilter = 5

    Add-PSSnapin AzureManagementToolsSnapIn

    Set-WindowsAzureLog -BufferQuotaInMB $bufferQuotaInMB `
        -TransferPeriod $transferPeriod `
        -LogLevelFilter $logLevelFilter `
        -DeploymentId $deploymentId -RoleName $roleName `
        -StorageAccountName $account -StorageAccountKey $key

  2. Launch PowerShell.
  3. Navigate to the directory containing Set-WindowsAzureBasicLogs.ps1.
  4. Invoke the cmdlet to modify the configuration:

    .\Set-WindowsAzureBasicLogs.ps1

How it works...

In step 1, we create the PowerShell script that changes the configuration of the Windows Azure Diagnostics basic logs data buffer. In code, this data buffer is configured through the BasicLogsBufferConfiguration class. In the diagnostics configuration for an instance stored in wad-control-container, it is specified in the Logs element. In the Windows Azure Platform PowerShell cmdlets, it is specified by WindowsAzureLog.

In the script, we must provide actual values for the {STORAGE_ACCOUNT}, {STORAGE_ACCOUNT_KEY}, {DEPLOYMENT_ID}, and {WEB_ROLE_NAME}. Note that we don't need to specify the instance ID, as the PowerShell cmdlet updates all instances of the role.

We add the AzureManagementToolsSnapIn snap-in to make the cmdlets available. The final action of step 1 is the invocation of the cmdlet itself, providing the desired values for the size of the data buffer, the scheduled transfer period, and the log-level filter: Verbose has a value of 5, Information has a value of 4, and so on.
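These numeric values correspond, as we read the SDK, to the LogLevel enumeration in Microsoft.WindowsAzure.Diagnostics; a sketch that prints the mapping:

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;

class ShowLogLevels
{
    static void Main()
    {
        // Numeric values accepted by -LogLevelFilter; Verbose is 5 and
        // Information is 4, with the remaining values as defined by the
        // LogLevel enumeration (verify against the installed SDK).
        foreach (LogLevel level in Enum.GetValues(typeof(LogLevel)))
        {
            Console.WriteLine("{0} = {1}", level, (int)level);
        }
    }
}
```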

In steps 2 and 3, we set up PowerShell.

In step 4, we invoke our script, using the .\ syntax to indicate that the script resides in the current directory.

Azure management cmdlets

Cerebrata has released a commercially supported set of Windows Azure Management Cmdlets that are more extensive than the Windows Azure Platform PowerShell cmdlets. Using the Cerebrata version, the Windows Azure Diagnostics basic logs buffer can be configured using the following script:

$account = "{STORAGE_ACCOUNT}"
$key = "{STORAGE_ACCOUNT_KEY}"
$deploymentId = "{DEPLOYMENT_ID}"
$roleName = "{WEB_ROLE_NAME}"
$instanceId = "{INSTANCE_ID}";
$bufferQuotaInMB = 100
$scheduledTransferPeriod = 60
$logLevelFilter = "Verbose"

Add-PSSnapin AzureManagementCmdletsSnapIn

Set-WindowsAzureLog -BufferQuotaInMB $bufferQuotaInMB `
    -ScheduledTransferPeriod $scheduledTransferPeriod `
    -LogLevelFilter $logLevelFilter `
    -DeploymentId $deploymentId -RoleName $roleName `
    -InstanceId $instanceId -AccountName $account -AccountKey $key

In the script, we must provide actual values for {STORAGE_ACCOUNT}, {STORAGE_ACCOUNT_KEY}, {DEPLOYMENT_ID}, {WEB_ROLE_NAME}, and {INSTANCE_ID}. Note that the Cerebrata cmdlet updates the diagnostic configuration for an individual instance, whereas the Windows Azure team version updates the diagnostic configuration for all instances of the role. The Cerebrata version also provides more detailed feedback while executing.

Summary

In this article we saw how Windows Azure Diagnostics provides for the non-intrusive capture of diagnostic data and its subsequent persistence to the Windows Azure Storage Service.


About the Author :


Neil Mackenzie

Neil Mackenzie has worked with computers for nearly three decades. He started his computer career doing large-scale numerical simulations for scientific research and business planning. Since then, he has been involved primarily in healthcare software, developing electronic medical records systems. He has been using Windows Azure since PDC 2008 and has used nearly all parts of the Windows Azure platform – including those that no longer exist. Neil is very active in the online Windows Azure community – in particular, helping to solve many of the questions raised in the MSDN Windows Azure Forums. He is a Microsoft MVP for Windows Azure.
