Limitless Analytics with Azure Synapse


Product type: Book
Published: Jun 2021
Publisher: Packt
ISBN-13: 9781800205659
Pages: 392
Edition: 1st
Author: Prashant Kumar Mishra

Table of Contents (20 chapters)

Preface
Section 1: The Basics and Key Concepts
  Chapter 1: Introduction to Azure Synapse
  Chapter 2: Considerations for Your Compute Environment
Section 2: Data Ingestion and Orchestration
  Chapter 3: Bringing Your Data to Azure Synapse
  Chapter 4: Using Synapse Pipelines to Orchestrate Your Data
  Chapter 5: Using Synapse Link with Azure Cosmos DB
Section 3: Azure Synapse for Data Scientists and Business Analysts
  Chapter 6: Working with T-SQL in Azure Synapse
  Chapter 7: Working with R, Python, Scala, .NET, and Spark SQL in Azure Synapse
  Chapter 8: Integrating a Power BI Workspace with Azure Synapse
  Chapter 9: Perform Real-Time Analytics on Streaming Data
  Chapter 10: Generate Powerful Insights on Azure Synapse Using Azure ML
Section 4: Best Practices
  Chapter 11: Performing Backup and Restore in Azure Synapse Analytics
  Chapter 12: Securing Data on Azure Synapse
  Chapter 13: Managing and Monitoring Synapse Workloads
  Chapter 14: Coding Best Practices
Other Books You May Enjoy

Bringing data to your Synapse SQL pool using Copy Data tool

The Copy Data tool makes it very easy to bring your data into Azure Synapse. It is not very different from using the Copy activity in Azure Data Factory, except that you do not have to spin up another service for data ingestion in Azure Synapse. Make sure you have met all of the technical requirements before you start following these steps:

  1. Click on Copy Data tool as highlighted in Figure 3.1. This will open a new window where you need to provide the source and destination connection details.
  2. Provide an appropriate name for your pipeline, along with a brief description.
  3. You can choose to run this pipeline once only, or you can schedule it to run regularly. For this example, we are going to schedule our pipeline to run on a daily basis.

    Click on Run regularly on schedule and select the Schedule trigger type.

  4. Provide an appropriate value for Start Date (UTC). This is auto-populated with the current date...
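Behind the wizard, the schedule configured in steps 3 and 4 is stored as a schedule trigger definition attached to your pipeline. As a rough sketch (the trigger name, pipeline name, and start time here are hypothetical, not values from the book), a daily trigger in Synapse looks like this in JSON:

```json
{
  "name": "DailyCopyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2021-06-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyDataPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

The `frequency` and `interval` fields correspond to the daily recurrence chosen in step 3, and `startTime` corresponds to the Start Date (UTC) value from step 4. You can view the generated JSON for any trigger from the Synapse Studio authoring pane.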