Azure Serverless Computing Cookbook - Third Edition

By Praveen Kumar Sreeram

About this book

This third edition of Azure Serverless Computing Cookbook guides you through the development of a basic back-end web API that performs simple operations, helping you understand how to persist data in Azure Storage services. You'll cover the integration of Azure Functions with other cloud services, such as notifications (SendGrid and Twilio), Cognitive Services (computer vision), and Logic Apps, to build simple workflow-based applications.

With the help of this book, you'll be able to leverage Visual Studio tools to develop, build, test, and deploy Azure functions quickly. It also covers a variety of tools and methods for testing the functionality of Azure functions locally in the developer's workstation and in the cloud environment. Once you're familiar with the core features, you'll explore advanced concepts such as durable functions, starting with a "hello world" example, and learn about the scalable bulk upload use case, which uses durable function patterns, function chaining, and fan-out/fan-in.

By the end of this Azure book, you'll have gained the knowledge and practical experience needed to be able to create and deploy Azure applications on serverless architectures efficiently.

Publication date: June 2020
Publisher: Packt
Pages: 458
ISBN: 9781800206601

 

1. Accelerating cloud app development using Azure Functions

In this chapter, we'll cover the following recipes:

  • Building a back-end web API using HTTP triggers
  • Persisting employee details using Azure Table storage output bindings
  • Saving profile picture paths to queues using queue output bindings
  • Storing images in Azure Blob Storage
  • Resizing an image using an ImageResizer trigger
 

Introduction

Every software application requires back-end components that are responsible for taking care of the business logic and storing data in some kind of storage, such as databases and filesystems. Each of these back-end components can be developed using different technologies. Azure serverless technology allows us to develop these back-end APIs using Azure Functions.

Azure Functions provides many out-of-the-box templates that solve most common problems, such as connecting to storage and building web APIs. In this chapter, you'll learn how to use these built-in templates. Along with learning about concepts related to Azure serverless computing, we'll also implement a solution to the basic problem domain of creating the components required for an organization to manage internal employee information.

Figure 1.1 highlights the key processes that you will learn about in this chapter:

Figure 1.1: The key processes

Let's go through a step-by-step explanation of the figure to get a better understanding:

  1. Client call to the API.
  2. Persist employee details using Azure Table Storage.
  3. Save profile picture links to queues.
  4. Invoke a queue trigger as soon as a message is created.
  5. Create the blobs in Azure Blob Storage.
  6. Invoke the blob trigger as soon as a blob is created.
  7. Resize the image and store it in Azure Blob Storage.

We'll leverage Azure Functions' built-in templates, starting with HTTP triggers, with the end goal of resizing images and storing them in Azure Blob Storage.

 

Building a back-end web API using HTTP triggers

In this recipe, we'll use Azure's serverless architecture to build a web API using HTTP triggers. These HTTP triggers could be consumed by any front-end application that is capable of making HTTP calls.

Getting ready

Let's start our journey of understanding Azure serverless computing using Azure Functions by creating a basic back-end web API that responds to HTTP requests.

How to do it…

Perform the following steps to build a web API using HTTP triggers:

  1. Navigate to the Function App listing page by clicking on the Function Apps menu, which is available on the left-hand side.
  2. Create a new function by clicking on the + icon:
    Figure 1.2: Adding a new function
  3. You'll see the Azure Functions for .NET - getting started page, which prompts you to choose the type of tools you'd like to use. For the first few chapters, we'll use the In-portal option, which lets you create Azure functions quickly, right from the portal, without any additional tools. In the coming chapters, however, we'll make use of Visual Studio and Azure Functions Core Tools to create these functions:
    Figure 1.3: Choosing the development environment
  4. In the next step, select More templates… and click on Finish and view templates, as shown in Figure 1.4:
    Figure 1.4: Choosing More templates… and clicking Finish and view templates
  5. In the Choose a template below or go to the quickstart section, choose HTTP trigger to create a new HTTP trigger function:
    Figure 1.5: The HTTP trigger template
  6. Provide a meaningful name. For this example, I have used RegisterUser as the name of the Azure function.
  7. In the Authorization level drop-down menu, choose the Anonymous option. You'll learn more about all the authorization levels in Chapter 9, Configuring security for Azure Functions:
    Figure 1.6: Selecting the authorization level
  8. Click on the Create button to create the HTTP trigger function.
  9. Along with the function, all the required code and configuration files will be created automatically, and the run.csx file will open in the editor with editable code. Remove the default code and replace it with the following code. In this example, we'll read two parameters (firstname and lastname) and return them in the output when the HTTP trigger runs:
    #r "Newtonsoft.Json"
    using System.Net;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.Primitives;
    using Newtonsoft.Json;
    public static async Task<IActionResult> Run(HttpRequest req, ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
        string firstname = null, lastname = null;
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic inputJson = JsonConvert.DeserializeObject(requestBody);
        firstname = firstname ?? inputJson?.firstname;
        lastname = inputJson?.lastname;
        return (firstname != null || lastname != null)
            ? (ActionResult)new OkObjectResult($"Hello, {firstname + " " + lastname}")
            : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
  10. Save the changes by clicking on the Save button available just above the code editor.
  11. Let's try testing the RegisterUser function using the test console. Click on the Test tab to open the test console:
    Figure 1.7: Testing the HTTP trigger
  12. Enter the values for firstname and lastname in the Request body section:
    Figure 1.8: Testing the HTTP trigger with input data
  13. Make sure that you select POST in the HTTP method drop-down box.
  14. After reviewing the input parameters, click on the Run button available at the bottom of the test console:
    Figure 1.9: HTTP trigger execution and output
  15. If the request payload is passed correctly with all the required parameters, you'll see Status: 200 OK, and the output in the output window will be as shown in Figure 1.9.
  16. Let's discuss how it works next.

How it works…

You have created your first Azure function using HTTP triggers and have made a few modifications to the default code. The code accepts the firstname and lastname parameters and prints the name of the end user with a Hello {firstname} {lastname} message as a response. You also learned how to test the HTTP trigger function right from the Azure Management portal.
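Any HTTP client can call this API, not just the portal's test console. The following is a minimal sketch of a C# console client; the function URL is a placeholder that you would replace with the one shown by the Get function URL link of your function, and no key is needed here because the authorization level was set to Anonymous:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class Program
    {
        static async Task Main()
        {
            using var client = new HttpClient();
            // Placeholder URL: replace with the URL shown by "Get function URL" in the portal.
            var url = "https://<your-function-app>.azurewebsites.net/api/RegisterUser";
            var body = new StringContent(
                "{\"firstname\":\"Bill\",\"lastname\":\"Gates\"}",
                Encoding.UTF8,
                "application/json");
            HttpResponseMessage response = await client.PostAsync(url, body);
            // Prints, for example: 200: Hello, Bill Gates
            Console.WriteLine($"{(int)response.StatusCode}: {await response.Content.ReadAsStringAsync()}");
        }
    }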

Note

For the sake of simplicity, validation of the input parameters is not performed in this exercise. Be sure to validate all input parameters in applications running in a production environment.
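For example, a minimal guard clause (a sketch, not part of the recipe) placed right after firstname and lastname are read from the request body would reject incomplete input:

    // Hypothetical validation: reject the request if either value is missing or blank.
    if (string.IsNullOrWhiteSpace(firstname) || string.IsNullOrWhiteSpace(lastname))
    {
        return new BadRequestObjectResult("Both firstname and lastname are required.");
    }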

See also

  • The Enabling authorization for function apps recipe in Chapter 9, Configuring security for Azure Functions.

In the next recipe, you'll learn about persisting employee details.

 

Persisting employee details using Azure Table Storage output bindings

In the previous recipe, you created an HTTP trigger and accepted input parameters. Now, let's learn how to store input data in a persistent medium. Azure Functions supports many ways to store data. For this example, we'll store data in Azure Table storage, a NoSQL key-value persistent medium for storing semi-structured data. Learn more about it at https://azure.microsoft.com/services/storage/tables/.

The primary key of an Azure Table storage table has two parts:

  • Partition key: Azure Table storage records are classified and organized into partitions. Each record located in a partition will have the same partition key (p1 in our example).
  • Row key: A unique value should be assigned to each row.

Getting ready

This recipe showcases the ease of integrating an HTTP trigger with the Azure Table storage service using output bindings. The HTTP trigger function receives the user's details and stores them in a storage table named tblUserProfile. We'll build on the RegisterUser function created in the previous recipe.

Let's get started.

How to do it…

Perform the following steps:

  1. Navigate to the Integrate tab of the RegisterUser HTTP trigger function.
  2. Click on the New Output button, select Azure Table Storage, and then click on the Select button:
    Figure 1.10: New output bindings
  3. If you are prompted to install the bindings, click on Install; this will take a few minutes. Once the bindings are installed, choose the following settings for the Azure Table Storage output bindings:

    Table parameter name: This is the name of the parameter that will be used in the Run method of the Azure function. For this example, provide objUserProfileTable as the value.

    Table name: A new table in Azure Table storage will be created to persist the data. If the table doesn't exist already, Azure will automatically create one for you! For this example, provide tblUserProfile as the table name.

    Storage account connection: If the Storage account connection string is not displayed, click on new (as shown in Figure 1.11) to create a new one or to choose an existing storage account.

    The Azure Table storage output bindings should be as shown in Figure 1.11:

    Figure 1.11: Azure Table Storage output bindings settings
  4. Click on Save to save the changes.
  5. Navigate to the code editor by clicking on the function name.

    Note

    The following are the initial lines of the code for this recipe:

    #r "Newtonsoft.Json"

    #r "Microsoft.WindowsAzure.Storage"

    The preceding lines of code instruct the function runtime to include references to the specified libraries.

    Paste the following code into the editor. The code will accept the input passed by the end user and save it in Table storage; click Save:

    #r "Newtonsoft.Json"
    #r "Microsoft.WindowsAzure.Storage"
    using System.Net;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Extensions.Primitives;
    using Newtonsoft.Json;
    using Microsoft.WindowsAzure.Storage.Table;
    public static async Task<IActionResult> Run(
        HttpRequest req,
        CloudTable objUserProfileTable,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");
        string firstname = null, lastname = null;
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic inputJson = JsonConvert.DeserializeObject(requestBody);
        firstname = firstname ?? inputJson?.firstname;
        lastname = inputJson?.lastname;
        UserProfile objUserProfile = new UserProfile(firstname, lastname);
        TableOperation objTblOperationInsert = TableOperation.Insert(objUserProfile);
        await objUserProfileTable.ExecuteAsync(objTblOperationInsert);
        return (firstname != null || lastname != null)
            ? (ActionResult)new OkObjectResult($"Hello, {firstname + " " + lastname}")
            : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
    class UserProfile : TableEntity
    {
        public UserProfile(string firstName, string lastName)
        {
            this.PartitionKey = "p1";
            this.RowKey = Guid.NewGuid().ToString();
            this.FirstName = firstName;
            this.LastName = lastName;
        }
        public UserProfile() { }
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }
  6. Execute the function from the Test tab by passing the firstname and lastname parameters in the Request body and clicking on the Run button.
  7. If there are no errors, you'll get a Status: 200 OK message as the output. Navigate to Azure Storage Explorer and view the Table storage to see whether a table named tblUserProfile was created successfully:
    Figure 1.12: Viewing data in Storage Explorer

How it works…

Azure Functions allows us to easily integrate with other Azure services just by adding an output binding to a trigger. In this example, we integrated an HTTP trigger with an Azure Table storage binding and configured the storage account connection string along with the Azure Table storage table in which a record is created for each HTTP request received by the trigger.

We have also added an additional parameter to handle the Table storage, named objUserProfileTable, of the CloudTable type, to the Run method. We can perform all the operations on Azure Table storage using objUserProfileTable.
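For instance, the same binding parameter can also read an entity back. The following is a minimal sketch, not part of the recipe; the row key value is a placeholder for the GUID generated when the record was inserted, and it relies on the UserProfile class defined in the recipe's code:

    // Hypothetical read operation using the same CloudTable binding parameter.
    string rowKey = "<row key GUID of an existing record>";
    TableOperation objTblOperationRetrieve =
        TableOperation.Retrieve<UserProfile>("p1", rowKey);
    TableResult retrieveResult =
        await objUserProfileTable.ExecuteAsync(objTblOperationRetrieve);
    UserProfile existingProfile = retrieveResult.Result as UserProfile;
    log.LogInformation($"Found: {existingProfile?.FirstName} {existingProfile?.LastName}");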

Note

The input parameters are not validated in the code sample. However, in a production environment, it's important to validate them before storing them in any kind of persistent medium.

We also created a UserProfile object and filled it with the values received in the request object, and then passed it to the table operation.

Note

Learn more about handling operations on the Azure Table storage service at https://docs.microsoft.com/azure/cosmos-db/tutorial-develop-table-dotnet.

Understanding storage connections

When you create a new storage connection (refer to step 3 of the How to do it… section of this recipe), a new entry is added to the function app's App settings:

Figure 1.13: Application settings in the configuration blade
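The generated setting simply holds the storage account's connection string so that the function's bindings can reach the account. The setting name below is a placeholder (the portal typically derives it from the storage account name), and the value follows the standard Azure Storage connection string format:

    <storage-account-name>_STORAGE = DefaultEndpointsProtocol=https;AccountName=<storage-account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net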

Navigate to App settings by clicking on the Configuration menu available in the General Settings section of the Platform features tab:

Figure 1.14: Configuration blade

You learned how to save data quickly using Azure Table storage bindings. In the next recipe, you'll learn how to save profile picture paths to queues.

 

Saving profile picture paths to queues using queue output bindings

The previous recipe highlighted how to receive two string parameters, firstname and lastname, in the Request body and store them in Azure Table storage. In this recipe, let's add a new parameter named ProfilePicUrl for the profile picture of the user that is publicly accessible via the internet. In this recipe (and the next), you'll learn about the process of extracting the URL of an image and saving it in the blob container of an Azure storage account.

Although the picture could be downloaded and processed directly in the HTTP trigger from the previous recipe, Persisting employee details using Azure Table storage output bindings, doing so is not ideal: downloading and processing a large image takes time and might hinder the performance of the overall application. For this reason, it is faster to grab the URL of the profile picture and store it in a queue; the queue message can then be processed later, when the picture is stored as a blob.

Getting ready

We'll be updating the code of the RegisterUser function that was used in the previous recipes.

How to do it…

Perform the following steps:

  1. Navigate to the Integrate tab of the RegisterUser HTTP trigger function.
  2. Click on the New Output button, select Azure Queue Storage, and then click on the Select button.
  3. Provide the following parameters in the Azure Queue Storage output settings:

    Message parameter name: Set the name of the parameter to objUserProfileQueueItem, which will be used in the Run method.

    Queue name: Set the queue name to userprofileimagesqueue.

    Storage account connection: It is important to select the right storage account in the Storage account connection field.

  4. Click on Save to create the new output binding.
  5. Navigate back to the code editor by clicking on the function name (RegisterUser in this example) or the run.csx file and make the changes shown in the following code:
    public static async Task<IActionResult> Run(
        HttpRequest req,
        CloudTable objUserProfileTable,
        IAsyncCollector<string> objUserProfileQueueItem,
        ILogger log)
    {
        ....
        string firstname = inputJson.firstname;
        string profilePicUrl = inputJson.ProfilePicUrl;
        await objUserProfileQueueItem.AddAsync(profilePicUrl);
        ....
        await objUserProfileTable.ExecuteAsync(objTblOperationInsert);
    }
  6. In the preceding code, you have added queue output bindings by adding the IAsyncCollector parameter to the Run method and passing the required message to the AddAsync method. The output bindings will take care of saving ProfilePicUrl to the queue. Now, click on Save to save the code changes in the code editor of the run.csx file.
  7. Let's test the code by adding another parameter, ProfilePicUrl, to the Request body and then clicking on the Run button in the Test tab of the Azure Functions code editor window. Replace "URL here" with the URL of an image that's accessible over the internet; you'll need to make sure that the image URL provided is valid:
    {
    "firstname": "Bill",
    "lastname": "Gates", 
    "ProfilePicUrl":"URL here"
    }
  8. If everything goes fine, you'll see the Status: 200 OK message again. Then, the image URL that was passed as an input parameter in the Request body will be created as a queue message in the Azure Queue storage service. Let's navigate to Azure Storage Explorer and view the queue named userprofileimagesqueue, which is the queue name that was provided in step 3.
  9. Figure 1.15 represents the queue message that was created:
    Figure 1.15: Viewing the output in Storage Explorer

How it works…

In this recipe, we added a queue message output binding and made the following changes to our existing code:

  • We added a new parameter named objUserProfileQueueItem of the IAsyncCollector<string> type, which binds the URL of the profile picture as the queue message content.
  • We called the AddAsync method of IAsyncCollector in the Run method, which saves the profile picture URL to the queue as a queue message.

In this recipe, you learned how to receive the URL of an image and save it as a message in the Azure Queue storage service. In the next recipe, we'll process that message and store the image in Azure Blob Storage.

 

Storing images in Azure Blob Storage

The previous recipe explained how to store an image URL in a queue message. Let's learn how to trigger an Azure function (queue trigger) when a new queue item is added to the Azure Queue storage service. Each message in the queue is a URL of the profile picture of a user, which will be processed by Azure Functions and stored as a blob in the Azure Blob Storage service.

Getting ready

While the previous recipe focused on creating queue output bindings, this recipe will explain how to grab an image's URL from a queue, create a corresponding byte array, and then write it to a blob.

Note that this recipe is a continuation of the previous recipes. Be sure to implement them first.

How to do it…

Perform the following steps:

  1. Create a new Azure function by choosing Azure Queue Storage Trigger from the templates.
  2. Provide the following details after choosing the template:

    Name the function: Provide a meaningful name, such as CreateProfilePictures.

    Queue name: Name the queue userprofileimagesqueue. This queue will be monitored by the Azure function. In the previous recipe, a new message was added to this queue for each valid request received by the RegisterUser HTTP trigger. For each new message added to the queue, the CreateProfilePictures trigger will be executed automatically.

    Storage account connection: The connection of the storage account where the queue is located.

  3. Review all the details and click on Create to create the new function.
  4. Navigate to the Integrate tab, click on New Output, choose Azure Blob Storage, and then click on the Select button.
  5. In the Azure Blob Storage output section, provide the following:

    Blob parameter name: Set this to outputBlob.

    Path: Set this to userprofileimagecontainer/{rand-guid}.

    Storage account connection: Choose the storage account for saving the blobs and click on the Save button:

    Figure 1.16: Azure Blob storage output binding settings
  6. Click on the Save button to save all the changes.
  7. Replace the default code of the run.csx file of the CreateProfilePictures function with the following code. The code grabs the URL from the queue, creates a byte array, and then writes it to a blob:
    using System;
    public static void Run(Stream outputBlob, string myQueueItem, ILogger log)
    {
        byte[] imageData = null;
        using (var wc = new System.Net.WebClient())
        {
            imageData = wc.DownloadData(myQueueItem);
        }
        outputBlob.Write(imageData, 0, imageData.Length);
    }
  8. Click on the Save button to save the changes. Make sure that there are no compilation errors in the Logs window.
  9. Let's go back to the RegisterUser function and test it by providing the firstname, lastname, and ProfilePicUrl fields, as we did in the Saving profile picture paths to queues using queue output bindings recipe.
  10. Navigate to the Azure Storage Explorer window and look at the userprofileimagecontainer blob container. You should find a new blob:
    Figure 1.17: Azure Storage Explorer

The image shown in Figure 1.17 can be viewed through any image viewing tool (such as MS Paint or Internet Explorer).

How it works…

We have created a queue trigger that gets executed when a new message arrives in the queue. Once it finds a new queue message, it reads the message, which is the URL of a profile picture. The function makes a web client request, downloads the image data in the form of a byte array, and then writes the data into the output blob.
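WebClient keeps the sample short, but it has been superseded in newer .NET versions. A roughly equivalent asynchronous sketch (an assumption, not the book's code) using HttpClient would look like this:

    using System.Net.Http;
    static readonly HttpClient httpClient = new HttpClient();
    public static async Task Run(Stream outputBlob, string myQueueItem, ILogger log)
    {
        // myQueueItem holds the image URL queued by the RegisterUser function.
        byte[] imageData = await httpClient.GetByteArrayAsync(myQueueItem);
        await outputBlob.WriteAsync(imageData, 0, imageData.Length);
        log.LogInformation($"Wrote {imageData.Length} bytes to the output blob.");
    }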

There's more…

The {rand-guid} binding expression generates a new GUID, which is used as the name of the blob that gets created each time the trigger is fired.
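Behind the scenes, the output binding configured in the portal is stored in the function's function.json file. A representative entry is shown below; the connection value is an assumption and will match the app setting generated for your storage account:

    {
      "type": "blob",
      "direction": "out",
      "name": "outputBlob",
      "path": "userprofileimagecontainer/{rand-guid}",
      "connection": "AzureWebJobsStorage"
    }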

Note

It is mandatory to specify the blob container name in the Path parameter of the Blob storage output binding while configuring the Blob storage output. Azure Functions creates the container automatically if it doesn't exist.

Queue messages can store up to 64 KB of data. To store messages larger than 64 KB, developers can use Azure Service Bus.

In this recipe, you learned how to invoke an Azure function when a new queue item is added to the Azure Storage Queue service. In the next recipe, you'll learn how to resize an image.

 

Resizing an image using an ImageResizer trigger

With the recent revolution in high-end smartphone cameras, it has become easy to capture high-quality pictures, which tend to be large. While a good-quality picture benefits the consumer, managing the storage of a popular website becomes a pain for application developers and administrators, since most platforms recommend that users upload high-quality profile pictures. Given this dilemma, it makes sense to use libraries that reduce the size of high-quality images while maintaining aspect ratio and quality.

This recipe will focus on implementing the functionality of resizing images without losing quality, using a NuGet package called SixLabors.ImageSharp.

Getting ready

In this recipe, you'll learn how to use a library named SixLabors.ImageSharp to resize an image to the required dimensions. For the sake of simplicity, we'll resize the image to the following dimensions:

  • Medium, at 200x200 pixels
  • Small, at 100x100 pixels

How to do it…

  1. Create a new Azure function by choosing Azure Blob Storage Trigger from the templates.
  2. Provide the following details after choosing the template:

    Name the function: Provide a meaningful name, such as ResizeProfilePictures.

    Path: Set this to userprofileimagecontainer/{name}.

    Storage account connection: Choose the storage account for saving the blobs and click on the Save button.

  3. Review all the details and click on Create to create the new function.
  4. Once the function is created, navigate to the Integrate tab, click on New Output, and choose Azure Blob Storage.
  5. In the Azure Blob Storage output section, provide the following:

    Blob parameter name: Set this to imageSmall.

    Path: Set this to userprofilesmallimagecontainer/{name}.

    Storage account connection: Choose the storage account for saving the blobs and click on the Save button.

  6. In the previous step, we added an output binding for creating a small image. In this step, let's create a medium image. Click on New Output and choose Azure Blob Storage. In the Azure Blob Storage output section, provide the following:

    Blob parameter name: Set this to imageMedium.

    Path: Set this to userprofilemediumimagecontainer/{name}.

    Storage account connection: Choose the storage account for saving the blobs and click on the Save button.

  7. Now, we need to add the NuGet package references to the Function App. In order to add the packages, a file named function.proj needs to be created, as shown in Figure 1.18:
    Figure 1.18: Adding a new file
  8. Open the function.proj file, paste the following content to download the libraries related to SixLabors.ImageSharp, and then click on the Save button:
    <Project Sdk="Microsoft.NET.Sdk">
        <PropertyGroup>
            <TargetFramework>netstandard2.0</TargetFramework>
        </PropertyGroup>
        <ItemGroup>
            <PackageReference Include="SixLabors.ImageSharp" Version="1.0.0-beta0007" />
        </ItemGroup>
    </Project>
  9. Once the package reference code has been added in the previous step, you'll be able to view a Logs window similar to Figure 1.19. Note that the compiler may throw a warning in this step, which can be ignored:
    Figure 1.19: A Logs window
  10. Now, let's navigate to the code editor and paste the following code:
    using SixLabors.ImageSharp;
    using SixLabors.ImageSharp.Formats;
    using SixLabors.ImageSharp.PixelFormats;
    using SixLabors.ImageSharp.Processing;
    public static void Run(Stream myBlob, string name, Stream imageSmall, Stream imageMedium, ILogger log)
    {
        try
        {
            IImageFormat format;
            using (Image<Rgba32> input = Image.Load<Rgba32>(myBlob, out format))
            {
                ResizeImageAndSave(input, imageSmall, ImageSize.Small, format);
            }
            myBlob.Position = 0;
            using (Image<Rgba32> input = Image.Load<Rgba32>(myBlob, out format))
            {
                ResizeImageAndSave(input, imageMedium, ImageSize.Medium, format);
            }
        }
        catch (Exception e)
        {
            log.LogError(e, "Unable to process the blob");
        }
    }
    public static void ResizeImageAndSave(Image<Rgba32> input, Stream output, ImageSize size, IImageFormat format)
    {
        var dimensions = imageDimensionsTable[size];
        input.Mutate(x => x.Resize(width: dimensions.Item1, height: dimensions.Item2));
        input.Save(output, format);
    }
    public enum ImageSize { ExtraSmall, Small, Medium }
    private static Dictionary<ImageSize, (int, int)> imageDimensionsTable = new Dictionary<ImageSize, (int, int)>()
    {
        { ImageSize.Small,  (100, 100) },
        { ImageSize.Medium, (200, 200) }
    };
  11. Now, navigate to the RegisterUser function and run it again. If everything is configured properly, the new containers should be created, as shown in Figure 1.20:
    Figure 1.20: Azure Storage Explorer
  12. Review the new images created in the new containers with the proper sizes, as shown in Figure 1.21:
    Figure 1.21: Displaying the output

How it works…

Figure 1.22 shows how the execution of the functions is triggered like a chain:

Figure 1.22: Illustration of the execution of the functions

We have created a new blob trigger function named ResizeProfilePictures, which is triggered immediately after the original blob (image) is uploaded. Whenever a new blob is created in the userprofileimagecontainer container, the function automatically creates two resized versions of the image, one in userprofilesmallimagecontainer and one in userprofilemediumimagecontainer.
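The recipe resizes every picture to fixed 100x100 and 200x200 squares, which can distort photos that aren't square. If you want to maintain the aspect ratio mentioned at the start of this recipe, the helper method could scale the image to fit inside the target box instead. The following variant is a minimal sketch (not the book's code), assumed to be dropped into the same run.csx so that it can reuse the imageDimensionsTable dictionary and ImageSize enum already defined there:

    // Hypothetical variant of ResizeImageAndSave that preserves the aspect ratio.
    public static void ResizeImageAndSaveKeepingAspect(Image<Rgba32> input, Stream output, ImageSize size, IImageFormat format)
    {
        var dimensions = imageDimensionsTable[size];
        // Scale factor that fits the image inside the target box without distortion.
        double scale = Math.Min((double)dimensions.Item1 / input.Width, (double)dimensions.Item2 / input.Height);
        int width = Math.Max(1, (int)(input.Width * scale));
        int height = Math.Max(1, (int)(input.Height * scale));
        input.Mutate(x => x.Resize(width, height));
        input.Save(output, format);
    }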

About the Author

  • Praveen Kumar Sreeram

    Praveen Kumar Sreeram is an author, Microsoft Certified Trainer, and certified Azure Solutions Architect. He has over 15 years of experience in the field of development, analysis, design, and the delivery of applications of various technologies. His projects range from custom web development using ASP.NET and MVC to building mobile apps using the cross-platform Xamarin technology for domains such as insurance, telecom, and wireless expense management. He has been given the Most Valuable Professional award twice by one of the leading social community websites, CSharpCorner, for his contributions to the Microsoft Azure community through his articles. Praveen is highly focused on learning about technology, and blogs about his learning regularly. You can also follow him on Twitter at @PrawinSreeram. Currently, his focus is on analyzing business problems and providing technical solutions for various projects related to Microsoft Azure and .NET Core.
