You're reading from  Azure Data Factory Cookbook

Product type: Book
Published in: Dec 2020
Publisher: Packt
ISBN-13: 9781800565296
Edition: 1st Edition
Authors (4):
Dmitry Anoshin

Dmitry Anoshin is a data-centric technologist and a recognized expert in building and implementing big data and analytics solutions. He has a successful track record of implementing business and digital intelligence projects in numerous industries, including retail, finance, marketing, and e-commerce. Dmitry possesses in-depth knowledge of digital/business intelligence, ETL, data warehousing, and big data technologies. He has extensive experience in the data integration process and is proficient in using various data warehousing methodologies. Dmitry has consistently exceeded project expectations while working in the financial, machine tool, and retail industries. He has completed a number of multinational full BI/DI solution life cycle implementation projects. With expertise in data modeling, Dmitry also has a background and business experience in multiple relational databases, OLAP systems, and NoSQL databases. He is also an active speaker at data conferences and helps people adopt cloud analytics.

Dmitry Foshin

Dmitry Foshin is a business intelligence team leader whose main goal is delivering business insights to the management team through data engineering, analytics, and visualization. He has led and executed complex full-stack BI solutions (from ETL processes to building DWHs and reporting) using Azure technologies, Data Lake, Data Factory, Databricks, MS Office 365, Power BI, and Tableau. He has also successfully launched numerous data analytics projects – both on-premises and in the cloud – that have helped achieve corporate goals in international FMCG companies, banking, and manufacturing industries.

Roman Storchak

Roman Storchak holds a PhD and is a chief data officer whose main interest lies in building data-driven cultures by making analytics easy. He has led teams that have built ETL-heavy products in AdTech and retail, and often uses Azure Stack, Power BI, and Data Factory.

Xenia Ireton

Xenia Ireton is a Senior Software Engineer at Microsoft. She has extensive knowledge of building distributed services, data pipelines, and data warehouses.


Chapter 3: Setting Up a Cloud Data Warehouse

This chapter will cover the key features and benefits of cloud data warehousing and Azure Synapse Analytics. You will learn how to connect and configure Azure Synapse Analytics, load data, build transformation processes, and operate pipelines.

You will navigate Azure Synapse Analytics and learn about its key components and benefits.

You will also learn how to create an Azure Synapse Analytics workspace, and how to load and transform data in Azure Synapse Analytics.

Then, you will learn how to develop, execute, and monitor pipelines using Azure Synapse.

Here is a list of recipes that will be covered in this chapter:

  • Connecting to Azure Synapse Analytics
  • Loading data to Azure Synapse Analytics using SSMS
  • Loading data to Azure Synapse Analytics using Azure Data Factory
  • Pausing/resuming an Azure SQL pool from Azure Data Factory
  • Creating an Azure Synapse workspace
  • Loading data to Azure Synapse Analytics using bulk load
  • Copying data in Azure Synapse Orchestrate
  • Using SQL on-demand

Technical requirements

For this chapter, you'll need the following:

  • An Azure subscription
  • SQL Server Management Studio (SSMS)
  • The dataset from this book's GitHub repository: https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/data

Connecting to Azure Synapse Analytics

In this recipe, we are going to create and set up a new Azure resource called Azure Synapse Analytics (formerly Azure SQL DW).

Getting ready

Before we start, please ensure that you have an Azure license and are familiar with the basics of Azure resources, such as the following:

  • The Azure portal
  • Creating and deleting Azure resources
  • Managing subscriptions
  • Managing costs and budgets in Azure

Let's get started!

How to do it…

  1. To create a new resource, search for Azure Synapse Analytics (formerly SQL DW) and press Create.
  2. Choose an existing subscription.
  3. Choose a resource group in which you want your new resource to be located:

    Figure 3.1 – Creating an Azure Synapse Analytics instance – basics

  4. Enter a SQL pool name – for example, adfcookbookch1devsqldb:

    Figure 3.2 – Creating an Azure Synapse Analytics instance – SQL pool

  5. Choose a server (or create...

Loading data to Azure Synapse Analytics using SSMS

In this recipe, we are going to configure Azure Synapse Analytics, add a new user, and load data into Azure Synapse Analytics from an external resource, such as Azure Blob storage.

Getting ready

Before we start, please ensure that you have created an Azure storage account and uploaded data into a Blob storage container. Please refer to Chapter 2, Orchestration and Control Flow, for guidelines on how to do that.

You need to upload the dataset from this book's GitHub repository to the container. Then, you need to generate shared access signatures so that Azure Synapse Analytics can connect to the blobs.

You can download the dataset from the book's GitHub repository, or you can use your own: https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/data.

In this section, we will use the following link to an Azure Blob storage container: https://adfcookbookch3adls.blob.core.windows.net/flightscontainer. You...
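
Assuming the container above and a SAS token generated for it, the load can be performed with a COPY statement run from SSMS. The following is a minimal sketch (not the book's exact script); the table name, file name, and SAS token are placeholders:

    -- Minimal sketch: load a CSV from the container above using a SAS credential.
    -- Table name, file name, and SAS token are placeholders; omit the leading '?'
    -- when pasting the SAS token into SECRET.
    COPY INTO [dbo].[Airlines]
    FROM 'https://adfcookbookch3adls.blob.core.windows.net/flightscontainer/airlines.csv'
    WITH (
        FILE_TYPE = 'CSV',
        CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
        FIRSTROW = 2
    );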

Loading data to Azure Synapse Analytics using Azure Data Factory

In this recipe, we will look further at how to load data into Azure Synapse Analytics using Azure Data Factory.

Getting ready

Before we start, please ensure that you have created a linked service to a Blob storage container and know how to create a Copy Data statement in Azure Data Factory. Please refer to Chapter 2, Orchestration and Control Flow, for guidelines on how to do that.

How to do it…

To load data into Azure Synapse Analytics using Azure Data Factory, use the following steps:

  1. Before we create a Copy Data statement in Azure Data Factory, we need to create a new table in Azure Synapse Analytics:
    CREATE TABLE [dbo].[Planes]
    (
        [Name] varchar(100) NOT NULL,
        [IATA_code] varchar(10) NULL,
        [ICAO_code] varchar(10) NULL
    )
  2. Open Azure Data Factory and launch the Copy data tool (as seen in the following screenshot), then select the linked service to Azure Blob storage or create a new one (refer...
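
After the pipeline has run, a quick sanity check (not part of the Copy data tool itself) confirms that the rows landed in the new table:

    -- Verify the load once the Copy Data pipeline completes.
    SELECT COUNT(*) AS LoadedRows FROM [dbo].[Planes];
    SELECT TOP 10 * FROM [dbo].[Planes];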

Pausing/resuming an Azure SQL pool from Azure Data Factory

In this recipe, you will create a new Azure Data Factory pipeline that allows you to automatically pause and resume your Azure SQL data warehouse.

Getting ready

Pause your Azure SQL pool before starting this recipe as you are going to resume it automatically using Azure Data Factory.

How to do it…

To pause or resume an Azure SQL pool with Azure Data Factory, use the following steps:

  1. Open the Author section of Azure Data Factory, create a new pipeline, and in the Activities section, choose Web. Rename the activity and the pipeline:

    Figure 3.12 – Azure Data Factory – web activity

    Go to the settings, then copy and paste the following text into URL:

    https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/resume?api-version=2019-06-01-preview
  2. As you can see, there are some parameters...
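
The URL above resumes the pool; pausing it uses a companion endpoint with the same structure and the same parameters, so a second Web activity pointing at the following URL will pause the pool (the api-version may differ in later releases):

    https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/pause?api-version=2019-06-01-preview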

Creating an Azure Synapse workspace

Azure Synapse brings together data integration, SQL analytics (which you will frequently pair with a tool such as Power BI), and Spark for big data processing in a single service for building enterprise analytics solutions. In this recipe, you will learn how to create a new Azure Synapse workspace and migrate your Azure SQL data warehouse into it.

Getting ready

You need to have an Azure subscription, an Azure resource group, and a Synapse SQL pool created.

How to do it…

To create an Azure Synapse workspace, use the following steps:

  1. In the Azure portal, click on Create new resource and select Azure Synapse Analytics (workspaces preview).
  2. Select your subscription and resource group:

    Figure 3.17 – Creating an Azure Synapse workspace

  3. Enter a new workspace name and select a region. You can either create a new Azure Data Lake Storage Gen2 account and file system or use existing...

Loading data to Azure Synapse Analytics using bulk load

Azure Synapse workspaces allow users to load data into a SQL pool with just a few clicks. In this recipe, you will learn how to do this.

Getting ready

You need to have created an Azure Synapse workspace and a SQL pool, and Azure Data Lake Storage Gen2 should be linked to that workspace. The Flights dataset (or any other dataset) should be uploaded to your storage.

How to do it…

  1. Open the Azure Synapse workspace (also known as Synapse Studio).
  2. Click on the Data tab on the left side of your screen.
  3. Expand your SQL pool and click on Actions to the right of Tables. Select New SQL script | New table:

    Figure 3.25 – Creating a new SQL script table in the Synapse Analytics workspace

  4. An automatically generated SQL query for a new table will be shown in the canvas. Replace it with the following script:
    CREATE TABLE [dbo].[Routes]
    (
        Airline VARCHAR(10) NOT NULL,
        ...
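
Behind the UI, the bulk load experience typically generates a COPY statement for you, similar in shape to the one shown in the SSMS recipe. A minimal sketch for a CSV file follows; the storage account, container, and file path are placeholders, and no explicit credential is shown because the linked ADLS Gen2 account is attached to the workspace:

    -- Sketch of the kind of COPY statement the bulk load wizard produces.
    -- The storage account, container, and file path are placeholders.
    COPY INTO [dbo].[Routes]
    FROM 'https://<storage-account>.dfs.core.windows.net/<container>/routes.csv'
    WITH (
        FILE_TYPE = 'CSV',
        FIELDTERMINATOR = ',',
        FIRSTROW = 2
    );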

Copying data in Azure Synapse Orchestrate

In this recipe, you will create a Copy Data pipeline using Azure Synapse Orchestrate.

Getting ready

You need to have an Azure Synapse workspace created and the Flights database loaded into an Azure Synapse SQL pool.

How to do it…

To copy data in Azure Synapse Orchestrate, use the following steps:

  1. Open the Azure Synapse workspace and go to Orchestrate.
  2. Add a new resource, select Pipeline, then select the Copy data activity, and rename it:

    Figure 3.31 – Creating a new pipeline with the Orchestrate tool of the Synapse Analytics workspace

  3. In the Source section, create a new source with a connection to Azure Synapse Analytics. Select Linked service, then enter the database name and the name of the table: dbo.Routes. Test the connection. You can also click Preview data to ensure that the table is loading correctly:

    Figure 3.32 – Specifying a connection in the Orchestrate tool of the Synapse Analytics...

Using SQL on-demand

In this recipe, you will learn how to use SQL on-demand in an Azure Synapse workspace.

Getting ready

You need to have an Azure Synapse workspace created and a file in Parquet format stored in your Azure Synapse storage account.

How to do it…

  1. Open the Azure Synapse workspace, go to Data, and open the folder that contains the file in Parquet format.
  2. Right-click on the file and choose New SQL script | Select TOP 100 rows:

    Figure 3.37 – Creating a new SQL script for a file in a storage account

    A new script is created for connecting to the file using SQL on-demand:

    Figure 3.38 – Connecting to the file from the Synapse workspace using SQL on-demand
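
The generated script is built around the OPENROWSET function and has roughly the following shape (a sketch with placeholder storage paths, not the exact generated text):

    -- Sketch of the auto-generated SQL on-demand query; the storage account,
    -- container, and file name are placeholders.
    SELECT TOP 100 *
    FROM OPENROWSET(
        BULK 'https://<storage-account>.dfs.core.windows.net/<container>/<file>.parquet',
        FORMAT = 'PARQUET'
    ) AS [result];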

    Note

    The query executes within several seconds; you don't need to wait several minutes for the cluster to start. You can copy this script and then paste it into SSMS.

  3. You can also use SQL on-demand to connect from SSMS or a visualization tool (such as Power BI or Tableau). For this...
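
When connecting from SSMS or a BI tool, you will need the workspace's SQL on-demand endpoint; it typically takes the following form, where the workspace name is a placeholder:

    <workspace-name>-ondemand.sql.azuresynapse.net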