
Extending Azure Data Factory with Logic Apps and Azure Functions

The Azure ecosystem comprises a wide variety of services, most of which can be integrated with and connected to Azure Data Factory (ADF). In this chapter, we will show you how to harness the power of serverless execution by integrating two of the most commonly used Azure services: Azure Logic Apps and Azure Functions. These recipes will help you understand how Azure services can be useful in designing Extract, Transform, Load (ETL) pipelines.

We will cover the following recipes in this chapter:

  • Triggering your data processing with Logic Apps
  • Using the Web activity to call an Azure logic app (sketched briefly after this list)
  • Adding flexibility to your pipelines with Azure Functions
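
To make the Web activity recipe concrete before we dive in, here is a minimal sketch of how such a call could be defined programmatically with the Azure SDK for Python (azure-mgmt-datafactory). This is an illustration only: the subscription, resource group, factory name, and Logic App callback URL are placeholders, not values from this book.

    # Hedged sketch: define an ADF pipeline whose Web activity POSTs to a
    # Logic App HTTP trigger. All identifiers below are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import PipelineResource, WebActivity

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # The Web activity calls the Logic App's HTTP-trigger callback URL.
    call_logic_app = WebActivity(
        name="CallLogicApp",
        method="POST",
        url="<logic-app-http-trigger-callback-url>",  # placeholder endpoint
        body={"message": "Triggered from Azure Data Factory"},
    )

    client.pipelines.create_or_update(
        "<resource-group>", "<factory-name>", "TriggerLogicAppPipeline",
        PipelineResource(activities=[call_logic_app]),
    )

The recipe itself builds the same activity through the ADF authoring UI; the SDK version is shown only to make the moving parts (method, URL, body) explicit.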

Join our book community on Discord

https://packt.link/p3FAF


When your business needs to move data between cloud providers, Azure Data Factory presents a convenient and robust interface for this task. Microsoft provides connectors to integrate the data factory with multiple third-party services, including Amazon Web Services (AWS) and Google Cloud. In this chapter, we will walk through several illustrative examples of migrating data from these two cloud providers. In addition, you will learn how to use Azure Data Factory's Custom Activity to work with providers that are not supported by Microsoft's built-in connectors (see the sketch after the recipe list below).

We will cover the following recipes:

  • Copying data from Amazon S3 to Azure Blob storage
  • Copying large datasets from S3 to ADLS
  • Copying data from Google Cloud Storage to Azure Data Lake
  • Copying data from Google BigQuery to Azure Data Lake Store
  • Migrating data from Google BigQuery to Azure Synapse
  • Copying data from Snowflake to Azure Data Lake Store...
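
As a preview of the Custom Activity mentioned above, the following hedged Python sketch shows roughly how one could be defined with azure-mgmt-datafactory. A Custom Activity runs a command on an Azure Batch pool, so the sketch assumes an existing Azure Batch linked service; the script and every name here are hypothetical placeholders.

    # Hedged sketch: a Custom Activity that runs an arbitrary script on an
    # Azure Batch pool, for providers without a built-in ADF connector.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        CustomActivity, LinkedServiceReference, PipelineResource,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    fetch_data = CustomActivity(
        name="FetchFromUnsupportedProvider",
        command="python fetch_data.py",  # hypothetical script on the Batch node
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference",
            reference_name="AzureBatchLinkedService",  # assumed to exist
        ),
    )

    client.pipelines.create_or_update(
        "<resource-group>", "<factory-name>", "CustomActivityPipeline",
        PipelineResource(activities=[fetch_data]),
    )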

Technical requirements

All recipes in this chapter assume that you have a Microsoft Azure account and an instance of a data factory. Refer to Chapter 1, Getting Started with ADF, for instructions on how to set up your Azure account and create a data factory.

For the recipes in this chapter, you will need accounts with sufficient permissions on the third-party services. For recipes 1 and 2, you will need to set up an account with AWS. For recipes 3, 4, and 5, you will need a Google Cloud account. For recipe 6, you will need a Snowflake account. If you do not already have accounts with these services, you can set them up for free.
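
If you prefer scripting to the portal, a data factory instance can also be created with the Azure SDK for Python. This is a hedged sketch with placeholder names, not a substitute for the walkthrough in Chapter 1:

    # Hedged sketch: create a data factory instance programmatically.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import Factory

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    client.factories.create_or_update(
        "<resource-group>", "<factory-name>", Factory(location="eastus"),
    )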

Copying data from Amazon S3 to Azure Blob storage

In this recipe, you will learn how to copy data from an AWS S3 bucket to an Azure Blob storage container using a data factory.

Getting ready

This recipe requires you to have an AWS account and an S3 bucket. Refer to the Technical requirements section to find out how to set up a new AWS account if you do not have one. Once you have your AWS account set up, go to https://s3.console.aws.amazon.com/s3/ to create a bucket. Upload the sample CSV files from https://github.com/PacktPublishing/Azure-Data-Factory-Cookbook/tree/master/data to your bucket.
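
If you would rather script the bucket setup than use the S3 console, the following sketch uses boto3 (the AWS SDK for Python). The bucket name and file names are hypothetical; substitute the CSV files you downloaded from the repository above.

    # Hedged sketch: create an S3 bucket and upload the sample CSV files.
    import boto3

    s3 = boto3.client("s3")  # assumes AWS credentials are configured locally

    bucket = "adf-cookbook-sample-data"  # hypothetical; must be globally unique
    s3.create_bucket(Bucket=bucket)      # works as-is in us-east-1; other
                                         # regions need a CreateBucketConfiguration

    for filename in ["sample1.csv", "sample2.csv"]:  # hypothetical file names
        s3.upload_file(filename, bucket, filename)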

How to do it…

Rather than designing the pipeline in the Author tab, we will use the Copy Data Wizard. The Copy Data Wizard will walk you through pipeline creation step by step, and will create and run the pipeline for you:

  1. Go to the home page of Azure Data Factory and select the Ingest tile to start the Copy Data Wizard.

    Figure 6.1 – Copy Data Wizard interface
  2. We need to define...

Copying large datasets from S3 to ADLS

Azure Data Factory can help you move very large datasets into the Azure ecosystem with speed and efficiency. The key to moving large datasets is data partitioning, and the way you partition depends heavily on the nature of your data.

In the following recipe, we will illustrate a methodology that utilizes a data partitioning table for moving a large dataset. We will use the public Common Crawl dataset, which contains petabytes of web crawl data from 2008 to the present day, hosted on the AWS S3 platform. We will only use a small subset of this data for our example, enough to illustrate the power of data factory parallel processing.
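
To give a flavour of how such partition-driven parallelism looks in code, here is a hedged Python sketch: a ForEach activity fans out over a list of partition prefixes and invokes a single-partition copy pipeline for each, several at a time. The pipeline names and parameters are hypothetical, not the ones built in this recipe.

    # Hedged sketch: fan out over partition prefixes with a parallel ForEach.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        ExecutePipelineActivity, Expression, ForEachActivity,
        ParameterSpecification, PipelineReference, PipelineResource,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # For each partition prefix, call a (hypothetical) single-partition copy
    # pipeline; batch_count controls the degree of parallelism.
    copy_partition = ExecutePipelineActivity(
        name="CopyOnePartition",
        pipeline=PipelineReference(
            type="PipelineReference", reference_name="CopySinglePartition",
        ),
        parameters={"prefix": "@item()"},
    )

    fan_out = ForEachActivity(
        name="CopyAllPartitions",
        items=Expression(value="@pipeline().parameters.partitions"),
        is_sequential=False,
        batch_count=8,  # up to 8 partitions copied concurrently
        activities=[copy_partition],
    )

    client.pipelines.create_or_update(
        "<resource-group>", "<factory-name>", "PartitionedCopyPipeline",
        PipelineResource(
            parameters={"partitions": ParameterSpecification(type="Array")},
            activities=[fan_out],
        ),
    )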

Getting ready

In order to access Amazon Web Services resources, such as an S3 bucket, you need to have the proper credentials. These credentials consist of an access key ID (for example, AKFAGOKFOLNN7EXAMPL8) and the secret access key itself (for example, pUgkrUXtPFEer/PO9rbNG/bPxRgiMYEXAMPLEKEY). In this book, we will...
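
In ADF, these credentials are plugged into an Amazon S3 linked service. As a hedged sketch of where they go (placeholder values throughout; in practice, reference the secret from Azure Key Vault rather than inlining it):

    # Hedged sketch: register an Amazon S3 linked service in the factory.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        AmazonS3LinkedService, LinkedServiceResource, SecureString,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    s3_linked_service = LinkedServiceResource(
        properties=AmazonS3LinkedService(
            access_key_id="<access-key-id>",
            # In production, use an Azure Key Vault secret reference instead.
            secret_access_key=SecureString(value="<secret-access-key>"),
        )
    )

    client.linked_services.create_or_update(
        "<resource-group>", "<factory-name>", "AmazonS3LinkedService",
        s3_linked_service,
    )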


Authors (4)

Dmitry Foshin

Dmitry Foshin is a business intelligence team leader, whose main goals are delivering business insights to the management team through data engineering, analytics, and visualization. He has led and executed complex full-stack BI solutions (from ETL processes to building DWH and reporting) using Azure technologies, Data Lake, Data Factory, Databricks, MS Office 365, Power BI, and Tableau. He has also successfully launched numerous data analytics projects – both on-premises and cloud – that help achieve corporate goals in international FMCG companies, banking, and manufacturing industries.

Tonya Chernyshova

Tonya Chernyshova is an experienced Data Engineer with over 10 years in the field, including time at Amazon. Specializing in Data Modeling, Automation, Cloud Computing (AWS and Azure), and Data Visualization, she has a strong track record of delivering scalable, maintainable data products. Her expertise drives data-driven insights and business growth, showcasing her proficiency in leveraging cloud technologies to enhance data capabilities.

Dmitry Anoshin

Dmitry Anoshin is a data-centric technologist and a recognized expert in building and implementing big data and analytics solutions. He has a successful track record of implementing business and digital intelligence projects in numerous industries, including retail, finance, marketing, and e-commerce. Dmitry possesses in-depth knowledge of digital/business intelligence, ETL, data warehousing, and big data technologies, has extensive experience in the data integration process, and is proficient in various data warehousing methodologies. He has consistently exceeded project expectations in the financial, machine tool, and retail industries, and has completed a number of multinational full BI/DI solution life cycle implementation projects. With expertise in data modeling, Dmitry also has a background and business experience in multiple relational databases, OLAP systems, and NoSQL databases. He is an active speaker at data conferences and helps people adopt cloud analytics.

Xenia Ireton

Xenia Ireton is a Senior Software Engineer at Microsoft. She has extensive knowledge in building distributed services, data pipelines, and data warehouses.