You're reading from Azure Data Factory Cookbook

Product type: Book
Published in: Dec 2020
Publisher: Packt
ISBN-13: 9781800565296
Edition: 1st
Authors (4):
Dmitry Anoshin

Dmitry Anoshin is a data-centric technologist and a recognized expert in building and implementing big data and analytics solutions. He has a successful track record of implementing business and digital intelligence projects in numerous industries, including retail, finance, marketing, and e-commerce. Dmitry possesses in-depth knowledge of digital/business intelligence, ETL, data warehousing, and big data technologies. He has extensive experience in the data integration process and is proficient in using various data warehousing methodologies. Dmitry has consistently exceeded project expectations while working in the financial, machine tool, and retail industries. He has completed a number of multinational full BI/DI solution life cycle implementation projects. With expertise in data modeling, Dmitry also has a background and business experience in multiple relational databases, OLAP systems, and NoSQL databases. He is also an active speaker at data conferences and helps people adopt cloud analytics.

Dmitry Foshin

Dmitry Foshin is a business intelligence team leader whose main goal is delivering business insights to the management team through data engineering, analytics, and visualization. He has led and executed complex full-stack BI solutions (from ETL processes to building DWHs and reporting) using Azure technologies, Data Lake, Data Factory, Databricks, MS Office 365, Power BI, and Tableau. He has also successfully launched numerous data analytics projects, both on-premises and in the cloud, that help achieve corporate goals in international FMCG companies, banking, and manufacturing industries.

Roman Storchak

Roman Storchak holds a PhD and is a chief data officer whose main interest lies in building data-driven cultures by making analytics easy. He has led teams that have built ETL-heavy products in AdTech and retail, and he often uses the Azure stack, Power BI, and Data Factory.

Xenia Ireton

Xenia Ireton is a Senior Software Engineer at Microsoft. She has extensive knowledge of building distributed services, data pipelines, and data warehouses.


Chapter 6: Integration with MS SSIS

SQL Server Integration Services (SSIS) is a highly capable ETL/ELT tool built into Microsoft SQL Server. It is often used to move and transform data both on-premises and in cloud environments. Its feature set largely resembles that of Azure Data Factory (ADF); at the same time, ADF offers high availability and massive scalability.

ADF adds convenience by letting you execute SSIS packages on an Azure-SSIS integration runtime or on an on-premises runtime. With this capability, you can leverage your existing SSIS packages and keep using familiar development tools.

Running SSIS packages is an essential feature of ADF. In this chapter, we will create a basic SSIS package, deploy it to a managed Azure SQL database, and then trigger its execution from ADF.

In this chapter, we will cover the following topics:

  • Creating a SQL Server database
  • Building an SSIS package
  • Running SSIS packages from ADF

By the end of the chapter, you will be able to build...

Technical requirements

You need to have access to Microsoft Azure. An Azure free account is sufficient for all the recipes in this chapter. You also have to download and install Visual Studio 2019. During installation, select the Data storage and processing workload together with the SQL Server Data Tools component. You can use the following tutorial: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-ver15#ssdt-for-visual-studio-2019.

Creating a SQL Server database

SSIS is part of the SQL Server ecosystem. Hence, we need to create a SQL server and a database and load the database with the data that will be used in the following recipes to deploy and run SSIS packages.

Getting ready

To get started with your recipe, log in to your Microsoft Azure account.

We assume that you have already configured a resource group and a storage account with Azure Data Lake Storage Gen2.

How to do it…

Let's prepare a SQL Server database; later, it will be used to run the SQL Server Integration Services package (a scripted alternative is sketched after the steps below). Perform the following steps:

  1. Go to SQL databases on Azure, as shown in the following screenshot:

    Figure 6.1 – Adding a SQL database

    Next, click + Add and then fill in the Database name field and click Create new, as shown in the following screenshot:

    Figure 6.2 – Setting a SQL Server

  2. Fill in the Server name, Server admin login, Password, and Confirm password fields. Select a Location. Preferably...
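The preceding steps use the Azure portal. If you prefer to script the setup, the same logical SQL server and database can be provisioned with the Azure SDK for Python. The following is a minimal sketch, assuming the azure-identity and azure-mgmt-sql packages are installed; the subscription ID, resource group, server and database names, and credentials are placeholders, not values prescribed by this recipe:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

# Placeholder values - replace with your own subscription, resource group, and names.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "adf-cookbook-rg"
SERVER_NAME = "adf-cookbook-sql"
DATABASE_NAME = "MovieLensDB"

credential = DefaultAzureCredential()
sql_client = SqlManagementClient(credential, SUBSCRIPTION_ID)

# Create the logical SQL server (the equivalent of the Create new step in the portal).
server = sql_client.servers.begin_create_or_update(
    RESOURCE_GROUP,
    SERVER_NAME,
    {
        "location": "eastus",
        "administrator_login": "sqladmin",
        "administrator_login_password": "<strong-password>",
    },
).result()

# Create the database on that server.
database = sql_client.databases.begin_create_or_update(
    RESOURCE_GROUP,
    SERVER_NAME,
    DATABASE_NAME,
    {"location": "eastus"},
).result()

print(f"Created {database.name} on {server.fully_qualified_domain_name}")
```

Note that before connecting to the database from your machine, you still need to allow your client IP in the server's firewall settings, which the portal's Networking tab can do for you.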

Building an SSIS package

SSIS packages are a great way to build ETL/ELT processes with SQL Server. Packages (or projects) store the sequence of steps that are performed to execute an activity. Let's build a package that connects to the SQL Server database that stores the MovieLens dataset, preprocesses the data, and stores the output in a new table.
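The package itself is assembled visually in Visual Studio, so this recipe involves no hand-written code. Purely as an illustration of the kind of transformation the package performs, here is a rough Python sketch using pyodbc that joins two MovieLens tables and writes the result into a new table; the table and column names (movies, ratings, MovieRatings, and so on) are assumptions for this example rather than names defined by the recipe:

```python
import pyodbc

# Placeholder connection details - replace with your Azure SQL server and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=adf-cookbook-sql.database.windows.net;"
    "DATABASE=MovieLensDB;"
    "UID=sqladmin;PWD=<strong-password>"
)
cursor = conn.cursor()

# Join the two source tables and materialize the result in a new table -
# a rough stand-in for what the SSIS data flow in this recipe produces.
cursor.execute(
    """
    SELECT m.movieId, m.title, r.userId, r.rating
    INTO dbo.MovieRatings
    FROM dbo.movies AS m
    JOIN dbo.ratings AS r ON r.movieId = m.movieId
    """
)
conn.commit()
conn.close()
```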

Getting ready

In order to follow this recipe, you need to finish the Creating a SQL Server database recipe, as we will build an SSIS package that will use both a database and data that we have prepared previously.

Then, you have to install the most recent version of Visual Studio and log in to Azure via Visual Studio. Visual Studio Community edition fully supports all the features that we will use in the following recipes.

How to do it…

Let's build an SSIS package that joins two tables and deploy it to SSISDB in Azure:

  1. Open Visual Studio and create a new project.
  2. Select Integration Services Project...

Running SSIS packages from ADF

SSIS packages are commonly used for on-premises tasks. To leverage existing infrastructure and scale up operations, SSIS packages can also be run in the cloud. Let's prepare the cloud infrastructure, deploy the package that we created in the previous recipe to Azure, and then trigger its execution from ADF.
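Once the package is deployed and the Azure-SSIS integration runtime is running, the final trigger can also be issued from code instead of the ADF UI. The following is a minimal sketch using the azure-mgmt-datafactory Python SDK; the resource group, factory, and pipeline names are placeholders, and the pipeline is assumed to contain an Execute SSIS Package activity:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder values - replace with your own names.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "adf-cookbook-rg"
FACTORY_NAME = "adf-cookbook-factory"
PIPELINE_NAME = "RunSsisPackage"  # a pipeline with an Execute SSIS Package activity

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Kick off a pipeline run, which in turn executes the SSIS package
# on the Azure-SSIS integration runtime.
run_response = adf_client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={}
)

# Check the run status (in practice, poll until it reaches Succeeded or Failed).
pipeline_run = adf_client.pipeline_runs.get(
    RESOURCE_GROUP, FACTORY_NAME, run_response.run_id
)
print(pipeline_run.status)
```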

Getting ready

Log in to your Microsoft Azure account.

We assume that you have followed all the previous recipes from this chapter. We will use the outcomes of all of them.

How to do it…

Let's set up the cloud infrastructure that will store data and host and run our SSIS packages (a scripted sketch of the integration runtime setup follows these steps):

  1. Go to the Azure Data Factory instance that you use for these recipes, open the Author & Monitor interface, go to Manage, and open the Integration runtimes page (see Figure 6.23).
  2. To add a new integration runtime, click + New:

    Figure 6.23 – Integration runtimes in Azure Data Factory

  3. Select Azure-SSIS (Lift-and-shift existing SSIS packages to execute in Azure...
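For reference, the Azure-SSIS integration runtime selected in the wizard above can also be created programmatically. The following is a rough sketch with the azure-mgmt-datafactory SDK; the node size, node count, catalog server endpoint, and credentials are placeholder assumptions rather than values from this recipe:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeResource,
    ManagedIntegrationRuntime,
    IntegrationRuntimeComputeProperties,
    IntegrationRuntimeSsisProperties,
    IntegrationRuntimeSsisCatalogInfo,
    SecureString,
)

# Placeholder values - replace with your own.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "adf-cookbook-rg"
FACTORY_NAME = "adf-cookbook-factory"
IR_NAME = "AzureSsisIR"

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, SUBSCRIPTION_ID)

# Azure-SSIS IR definition: a small managed cluster plus the SSISDB catalog,
# which ADF provisions on the SQL server created earlier in this chapter.
ssis_ir = IntegrationRuntimeResource(
    properties=ManagedIntegrationRuntime(
        compute_properties=IntegrationRuntimeComputeProperties(
            location="EastUS",
            node_size="Standard_D2_v3",
            number_of_nodes=1,
            max_parallel_executions_per_node=2,
        ),
        ssis_properties=IntegrationRuntimeSsisProperties(
            catalog_info=IntegrationRuntimeSsisCatalogInfo(
                catalog_server_endpoint="adf-cookbook-sql.database.windows.net",
                catalog_admin_user_name="sqladmin",
                catalog_admin_password=SecureString(value="<strong-password>"),
                catalog_pricing_tier="Basic",
            )
        ),
    )
)

adf_client.integration_runtimes.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, IR_NAME, ssis_ir
)
# Starting the runtime is a long-running operation; expect it to take several
# minutes (longer if the runtime joins a virtual network).
adf_client.integration_runtimes.begin_start(
    RESOURCE_GROUP, FACTORY_NAME, IR_NAME
).result()
```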