Simple Introduction

Azure Data Factory

ADF Series Part 01 - What is that wizard?


Where are we?

As enterprises, we are in an inevitable stage of modernization and digital transformation, so we need to be ready to move or bring data from different sources, with varying frequency and in various formats, into the cloud!

Modernization!

Why data?

Oh yes! Data is at the core of your business insights, business intelligence, machine learning, and artificial intelligence.

Take a look at the current situation, COVID-19 (yes, this article was written during the 2020 pandemic!).

Gargantuan amounts of data are collected using:

· Real-time trackers

· Thermal sensors

· Scanners

· Custom-built apps to monitor contact interaction patterns

Data collected from these sources is aggregated and synthesized into usable data at a global level, used to build machine learning models, and fed into AI tools in order to track and prevent the spread, and even to help design vaccines.

Data is not only the new oil; it actually drives!

Why Cloud?

Currently, there is no better way to handle data at this scale. We not only have to store it, we also have to process it and produce results quickly (on par with the speed at which COVID scales)!

Data & Cloud:

[Figure: Different stages of DATA!]

When we talk about data and its power, you have to collect, store, transform, integrate, and prepare your data to extract that power.

That’s where we need a wizard that can automate, orchestrate, and manage the data process. ADF is one of the platforms that plays the role of the wizard across all the stages of data.

ADF — The Wizard

Official Definition of ADF from docs:

“It is the cloud-based ETL and data integration service that allows you to create data-driven workflows for orchestrating data movement and transforming data at scale.”

It is also code-free for ETL/ELT/data integration projects.

Azure Data Factory is the cloud-based ETL and data integration service that allows you to create data-driven workflows (called pipelines) for orchestrating data movement and transforming data at scale. Using Azure Data Factory, you can create and schedule pipelines that ingest data from disparate data stores inside the Azure platform, as well as bring in data from servers outside Azure, such as an on-premises SQL Server, or from other cloud services such as Salesforce, using its wide range of connectors (currently 90+).
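To give a feel for what a pipeline actually is under the hood, here is a minimal sketch of an ADF pipeline definition in JSON with a single Copy activity. The pipeline, activity, and dataset names (`CopySalesDataPipeline`, `SourceBlobDataset`, `SinkSqlDataset`) are hypothetical placeholders, not from the source; the datasets themselves would be defined separately in the factory.

```json
{
  "name": "CopySalesDataPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyBlobToSql",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceBlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkSqlDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      }
    ]
  }
}
```

In practice you rarely hand-write this JSON; the ADF visual authoring tool generates it for you as you drag activities onto the canvas, which is what makes the experience code-free.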

Does that sound interesting? Are you in? Let’s go!

Conclusion

This was a simple introduction to ADF and where it fits in the data and cloud world. As we progress through the series, we will dive deeper into its functionality with some hands-on exercises along the way!

Vijay

https://twitter.com/vijayganss

www.linkedin.com/in/vijayganeshs

Big Data/Cloud Guy.. Mostly working on Azure/GCP