DP-200: Implementing Azure Data Solution Exam Prep (DP-201+)

Pass DP-200 & DP-201: Learn Azure SQL Database, Cosmos DB, Data Warehouse, Data Lake, Data Factory, Databricks in 9 hours
4.0 (47 ratings)
Course Ratings are calculated from individual students’ ratings and a variety of other signals, like age of rating and reliability, to ensure that they reflect course quality fairly and accurately.
259 students enrolled
Created by Sarafudheen PM
Last updated 5/2020
English
Current price: $12.99 Original price: $19.99 Discount: 35% off
30-Day Money-Back Guarantee
This course includes
  • 9 hours on-demand video
  • 3 articles
  • 23 downloadable resources
  • 1 Practice Test
  • Full lifetime access
  • Access on mobile and TV
  • Certificate of Completion
What you'll learn
  • You Will Master Azure Data Factory
  • How To Create Data Flows And Control Flows In Azure Data Factory
  • How To Use Parameters and Variables In Azure Data Factory
  • How To Implement Different Data Storage Solutions In Azure
  • How To Implement a Solution That Uses Data Lake Storage Gen2
  • How To Create Azure SQL Databases
  • How To Configure Azure Databricks (A Service To Perform Big Data Analysis On the Azure Cloud)
  • How To Access Azure Data Lake From Azure Databricks (With Python Code)
  • How To Connect To an Azure SQL Database From On-Premises Using SSMS
  • How To Execute Queries Against an Azure SQL Database Using the Azure Query Editor
  • How To Migrate a Database From On-Premises To Azure Using SSMS
  • How To Move Data To Azure Data Lake From On-Premises Using Storage Explorer
  • How To Access an On-Premises Environment Using a Self-Hosted Integration Runtime From ADF
  • How To Create Data Set and Connections In ADF (Azure Data Factory)
  • How To Create Pipeline In Azure Data Factory (ADF)
  • How To Implement a solution that uses Azure Blob storage
  • Implement Storage Systems With High Availability, Disaster Recovery, And Global Distribution With Geo Replications
  • How To Create an Elastic Pool And Deploy Multiple Databases On a Single Server
  • Provide Access To Data To Meet Security Requirements
  • How To Create Azure SQL Data Warehouse (Azure Synapse Analytics)
  • configure elastic pools, configure geo-replication, implement PolyBase
  • Learn to use PolyBase external tables to load data from Azure Data Lake Storage
  • Create database objects required to load from Data Lake Storage
  • Connect to a Data Lake Storage directory (With Credential and secret)
  • implement Copy Activity within Azure Data Factory, create linked services and datasets
  • Develop batch processing solutions using Data Factory and Azure Databricks
  • Create pipelines and activities, implement Mapping Data Flows in Azure Data Factory
  • Learn to implement Azure Databricks cluster creation
  • Learn to ingest data into Azure Databricks
  • Learn to implement Azure Databricks notebooks, jobs, and autoscaling
Requirements
  • An internet connection to watch the videos
  • A laptop or PC to practice the labs (optional)
  • Headphones for audio
Description

Microsoft Azure (formerly Windows Azure) is a cloud computing service created by Microsoft for building, testing, deploying, and managing applications and services through Microsoft-managed data centers.

Azure provides three service models:

  1. Software as a Service (SaaS)

  2. Platform as a Service (PaaS)

  3. Infrastructure as a Service (IaaS)


Azure supports many different programming languages, tools, and frameworks, including both Microsoft-specific and third-party software and systems. Azure was announced in October 2008, started with the codename "Project Red Dog", and released on February 1, 2010, as "Windows Azure" before being renamed "Microsoft Azure" on March 25, 2014.

In this course, you will learn :

  1. How To Use Azure Data Factory.

  2. How To Use Azure SQL Database.

  3. How To Use Azure Blob Storage.

  4. How To Use Azure Data Lake.

  5. How To Use Azure Databricks.

  6. How To Use Different Azure Data Services In Different Applications.

Exam DP-200: Implementing an Azure Data Solution:

Candidates for this exam must be able to implement data solutions that use the following Azure services:

  • Azure Cosmos DB.

  • Azure SQL Database.

  • Azure Synapse Analytics (formerly Azure SQL DW).

  • Azure Data Lake Storage.

  • Azure Data Factory.

  • Azure Stream Analytics.

  • Azure Databricks.

  • Azure Blob storage.

Topics We Cover In This Course:

  • Azure SQL Database.

  • Azure Cosmos DB.

  • Azure Data Lake Storage.

  • Azure Data Factory.

  • Azure Databricks.

  • Azure Blob storage.

  • Azure Synapse Analytics (formerly Azure SQL DW).



Candidates for Exam DP-200: Implementing an Azure Data Solution are Microsoft Azure data engineers who identify business requirements and implement appropriate data solutions using Azure data services such as Azure SQL Database, Azure Cosmos DB, Azure Data Factory, Azure Databricks, and Azure SQL Data Warehouse (Azure Synapse Analytics).

This course covers how to provision data storage services such as Azure SQL Database, storage accounts, and data lakes. In the Azure Data Factory section, we cover how to transform your data, identify performance bottlenecks, and access external data sources, including on-premises SQL Server instances and file systems.


Azure Data Factory (ADF):

Azure Data Factory (ADF) is a fully managed service for composing data storage, processing, and movement services into streamlined, scalable, and reliable data production pipelines. It is designed to let developers integrate disparate data sources, and it plays a role somewhat like SSIS or Alteryx in the Azure environment, managing the data you have both on-premises and in the cloud.

It provides access to on-premises data with the help of a gateway component. Using this link, you can connect to on-premises file systems as well as on-premises SQL databases. From Azure Data Factory, you can access almost all Azure services without difficulty.

Access to on-premises data is provided through a data management gateway that connects to on-premises SQL Server databases. We will show you how to install this software and how to connect your on-premises environment to the Azure cloud.

If you have ever created data transfer activities in Azure or in SSIS, you will find ADF a familiar tool. With ADF, you can focus on your data; the serverless integration service does the rest.
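To make this concrete, here is a minimal sketch of the JSON definition that a single-Copy-activity pipeline resolves to in ADF, built as a Python dictionary. The pipeline and dataset names are hypothetical placeholders; the datasets would be defined separately on top of linked services.

```python
import json

# Minimal ADF pipeline definition with one Copy activity.
# "BlobInputDataset" and "SqlOutputDataset" are hypothetical dataset names.
pipeline = {
    "name": "CopyBlobToSqlPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToAzureSql",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "BlobInputDataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "SqlOutputDataset", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    # Source/sink types depend on the dataset formats involved.
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "AzureSqlSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In the ADF authoring UI this JSON is generated for you; seeing the shape once makes the pipeline/activity/dataset vocabulary easier to follow.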


Topics In Azure Data Factory:

  • Append Variable Activity

  • Execute Pipeline activity

  • ForEach activity

  • Get Metadata activity

  • If Condition activity

  • Lookup activity

  • Set variable activity

  • Until activity

  • Validation activity

  • Data Flow activity

  • Mapping data flow

    • Aggregate transformation.

    • Alter row transformation.

    • Conditional split transformation.

    • Derived column transformation.

    • Exists transformation.

    • Join transformation.

    • Lookup transformation.

    • The new branch mapping data flow transformation.

    • Select transformation.

    • Sink transformation.

    • Source transformation.

    • Azure Data Factory union transformation.

  • Parameterizing

  • Trigger In Azure Data Factory.

    • Manual Trigger.

    • Scheduled Trigger.

    • Tumbling window

    • Event Trigger.

      • Dynamic Data processing And Pipeline Execution Based on External Event.

  • and many more (with real-life scenarios). Check out our course descriptions for updated information.
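As a taste of the trigger material, here is a sketch of what a tumbling-window trigger definition looks like; the pipeline name, start time, and parameter names are hypothetical placeholders. The trigger fires for fixed, non-overlapping one-hour windows and passes the window boundaries to the pipeline.

```python
import json

# Sketch of an ADF tumbling-window trigger definition (hypothetical names/times).
trigger = {
    "name": "HourlyTumblingTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Hour",       # window size unit
            "interval": 1,             # one-hour windows
            "startTime": "2020-05-01T00:00:00Z",
            "maxConcurrency": 1,       # process windows one at a time
        },
        "pipeline": {
            "pipelineReference": {
                "referenceName": "MyCopyPipeline",
                "type": "PipelineReference",
            },
            # Window boundaries are exposed via trigger().outputs expressions.
            "parameters": {
                "windowStart": "@trigger().outputs.windowStartTime",
                "windowEnd": "@trigger().outputs.windowEndTime",
            },
        },
    },
}

print(json.dumps(trigger, indent=2))
```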

SQL Database - Cloud Database as a Service:

Azure SQL Database is a fully managed relational database with built-in intelligence supporting self-driving features such as performance tuning and threat alerts. According to Wikipedia, Microsoft Azure SQL Database is a managed cloud database provided as part of Microsoft Azure. A cloud database is a database that runs on a cloud computing platform, and access to it is provided as a service. Managed database services take care of scalability, backup, and high availability of the database.

Azure SQL Database: Azure SQL Database is a relational database-as-a-service (DBaaS) based on the latest stable version of Microsoft SQL Server. It is a fully managed Platform as a Service (PaaS) Database Engine that handles most of the database management functions such as upgrading, patching, backups, and monitoring without user involvement.

In this course, we will show you how to launch an Azure SQL database in five minutes, with and without sample data. We will also show you how to use SQL elastic pools.

Elastic pools help you manage and scale multiple Azure SQL databases. According to Azure documentation, SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and unpredictable usage demands. With an elastic pool, you determine the amount of resources that the elastic pool requires to handle the workload of its databases, and the amount of resources for each pooled database.

Azure SQL Database is always running on the latest stable version of the SQL Server Database Engine and patched OS with 99.99% availability.

The databases in an elastic pool are on a single Azure SQL Database server and share a set number of resources at a set price. By the end of this course, you will have a clear idea about how to configure SQL elastic pool.
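As a rough illustration of the sizing logic, a simplification of the official guidance with made-up numbers: a pool must cover both the combined average load of all its databases and the combined load of the databases that peak at the same time.

```python
def estimate_pool_edtus(avg_dtu_per_db, peak_dtu, concurrent_peaks):
    """Rough elastic-pool sizing sketch (simplified from the Azure docs).

    avg_dtu_per_db:   average DTU utilization of each pooled database
    peak_dtu:         DTU usage of a single database at its peak
    concurrent_peaks: how many databases peak at the same time
    """
    total_average = sum(avg_dtu_per_db)        # steady-state demand
    combined_peak = peak_dtu * concurrent_peaks  # worst-case simultaneous spike
    return max(total_average, combined_peak)

# 20 databases averaging 10 DTUs each, peaking at 100 DTUs,
# with at most 3 peaking simultaneously:
print(estimate_pool_edtus([10] * 20, 100, 3))  # -> 300
```

Here the combined peak (300 eDTUs) dominates the total average (200 eDTUs), so 20 pooled databases can share a 300-eDTU pool instead of paying for 20 × 100 DTUs of standalone headroom.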

Geo-Replication:

Active geo-replication is an Azure SQL Database feature that allows you to create readable secondary databases of individual databases on a SQL Database server in the same or a different data center (region). We will show you how to configure geo-replication and manually force a failover to the secondary database.
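A sketch of the T-SQL behind those two steps, held in Python strings here; the database and server names are hypothetical placeholders.

```python
# Sketch of the T-SQL used to manage active geo-replication for one database.
def geo_replication_statements(database, partner_server):
    # Run on the PRIMARY server: create a readable secondary on the partner.
    add_secondary = (
        f"ALTER DATABASE [{database}] "
        f"ADD SECONDARY ON SERVER [{partner_server}] "
        f"WITH (ALLOW_CONNECTIONS = ALL);"
    )
    # Run on the SECONDARY server: promote it to primary (planned failover).
    failover = f"ALTER DATABASE [{database}] FAILOVER;"
    return add_secondary, failover

add_sql, failover_sql = geo_replication_statements("SalesDb", "dr-server-westus")
print(add_sql)
print(failover_sql)
```

The same configuration can be done entirely from the Azure portal, which is what the course demonstrates.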

Azure Cosmos DB:

Azure Cosmos DB is Microsoft's globally distributed, multi-model database service. With the click of a button, Cosmos DB enables you to elastically and independently scale throughput and storage across any number of Azure regions worldwide. In this course, we will see:

  • How to create a Cosmos DB account

  • How to create databases inside your Cosmos DB account

  • How to insert data into Cosmos DB containers

  • How to retrieve data saved in Cosmos DB tables or containers by using SQL
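As a small illustration of the last point, this is the query-plus-parameters shape that the azure-cosmos Python SDK's `query_items` expects; the container field names here are hypothetical.

```python
# Build a parameterized Cosmos DB SQL query in the shape the azure-cosmos
# SDK expects. "customerId" and "total" are hypothetical item fields.
def order_query(customer_id, min_total):
    query = (
        "SELECT c.id, c.total FROM c "
        "WHERE c.customerId = @customerId AND c.total >= @minTotal"
    )
    parameters = [
        {"name": "@customerId", "value": customer_id},
        {"name": "@minTotal", "value": min_total},
    ]
    return query, parameters

# With the SDK this would be passed to a container client, e.g.:
#   container.query_items(query=query, parameters=parameters,
#                         enable_cross_partition_query=True)
q, p = order_query("C-1001", 50)
print(q)
```

Parameterizing the query (rather than string-formatting values into it) is the same injection-safety habit you would use with any SQL database.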

Introduction to Azure Storage:

Azure Storage is Microsoft's cloud storage solution for modern data storage scenarios. Azure Storage offers a massively scalable object store for data objects, a file system service for the cloud, a messaging store for reliable messaging, and a NoSQL store.

In this course, we will cover how to create a storage account, how to create containers and file systems, how to upload data into these services, and how to access these storage services from other Azure data services such as Data Factory, Databricks, and SQL Database.

Azure Data Lake Storage:

Azure Data Lake Storage is a fully managed, elastic, scalable, and secure file system that supports HDFS semantics and works with the Hadoop ecosystem. Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob storage. Data Lake Storage Gen2 is the result of converging the capabilities of two existing storage services, Azure Blob storage and Azure Data Lake Storage Gen1.

In this course, we will see how to create data lakes, how to move CSV data from Azure Blob storage to Azure Data Lake using Azure Data Factory, and how to read data stored in Azure Data Lake using Azure Databricks.
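The Databricks-to-Data-Lake connection boils down to a handful of Spark configuration keys. Here is a sketch assuming service-principal (OAuth) authentication against ADLS Gen2; every ID and the secret are placeholders, and in practice the secret would come from a Key Vault-backed secret scope.

```python
# Spark configuration for OAuth (service principal) access to ADLS Gen2
# from Azure Databricks. All angle-bracket values are placeholders.
adls_oauth_conf = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-id>",
    "fs.azure.account.oauth2.client.secret": "<service-credential>",
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Inside a Databricks notebook these would be applied and then used to read:
#   for key, value in adls_oauth_conf.items():
#       spark.conf.set(key, value)
#   df = spark.read.csv(
#       "abfss://<container>@<account>.dfs.core.windows.net/data.csv",
#       header=True)
for key in sorted(adls_oauth_conf):
    print(key)
```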

Azure Databricks:

Azure Databricks provides the latest versions of Apache Spark and allows you to seamlessly integrate with open-source libraries. Spin up clusters and build big data applications. According to the Databricks documentation, Azure Databricks is a fast, easy, and collaborative Apache Spark™-based analytics platform optimized for Azure. Azure Databricks supports Python, Scala, R, Java, and SQL, as well as data science frameworks and libraries including TensorFlow, PyTorch, and scikit-learn.

In this course, we will show you how to configure Azure Databricks, how to launch a cluster, and how to create notebooks.

Azure Synapse Analytics (Azure SQL Data Warehouse):

Azure Synapse is a limitless analytics service that brings together enterprise data warehousing and big data analytics. Azure Synapse is Azure SQL Data Warehouse evolved. In this course, you will learn how to create an Azure SQL pool and access a data lake storage account (using PolyBase external tables to load data from Azure Data Lake Storage). We will demonstrate how to create a master key and a database scoped credential, and how to create external tables and external data sources. Finally, we will see how to load data into Azure SQL Data Warehouse from an external table by using the CREATE TABLE AS SELECT (CTAS) command.
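The load sequence just described can be sketched as one T-SQL script (held in a Python string here; every object name, password, and storage path is a hypothetical placeholder):

```python
# Sketch of the PolyBase load sequence for Azure SQL Data Warehouse.
# All object names, secrets, and paths below are hypothetical placeholders.
polybase_load = """
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';

CREATE DATABASE SCOPED CREDENTIAL AdlsCredential
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

CREATE EXTERNAL DATA SOURCE AdlsSource
WITH (TYPE = HADOOP,
      LOCATION = 'abfss://<container>@<account>.dfs.core.windows.net',
      CREDENTIAL = AdlsCredential);

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2));

CREATE EXTERNAL TABLE dbo.SalesExternal (Id INT, Amount DECIMAL(10, 2))
WITH (LOCATION = '/sales/', DATA_SOURCE = AdlsSource, FILE_FORMAT = CsvFormat);

-- CTAS: materialize the external data into a distributed warehouse table.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = ROUND_ROBIN, CLUSTERED COLUMNSTORE INDEX)
AS SELECT * FROM dbo.SalesExternal;
"""

print(polybase_load)
```

Each statement mirrors one demo in the Synapse section of the course: master key, scoped credential, external data source, external file format, external table, and finally the CTAS load.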

Demo 1:

  1. Create a Pipeline in Azure Data Factory.

  2. Create Input Connections to a source.

  3. Create Input Data Set.

  4. Create Output Connections to destinations.

  5. Create an Output Data set.

  6. Create a Copy Activity to copy data from on-premises to Azure Blob storage.

  7. Create a Copy Activity to copy data from Blob storage to Azure SQL Database.

  8. Create a Copy Activity to copy data from on-premises to Azure SQL Database.

  9. Run your Copy Activities and validate all the settings.

Demo-2:

  1. Migrate a database from an on-premises SQL Server to Azure SQL Database without any external services.

Demo 3:

  1. Create an Azure Databricks workspace.

  2. Connect To Azure Data Lake.

  3. Create a Cluster To Run our notebook

  4. Configure Azure Databricks Data lake configurations.

  5. Assign permission to your external Applications.

  6. Read CSV data saved inside Azure Data lake using a Python notebook.

Demo 4:

  1. Create Your First Data Flow In Azure Data Factory.

  2. Configure The Source Data flow.

  3. Learn To use Filter Conditions.

  4. Learn To configure Sink (Destinations) in Azure.

  5. Run your Azure Data flow and copy data from Azure Blob Data Store And Save filtered result In Azure Data lake.

Demo 5:

  1. Access an on-premises SQL Server.

  2. Create different data sets by executing a custom stored procedure with dynamic parameters.

  3. Save each data set into the Data Lake with a custom filename.

  4. Trigger this action from Azure Data Factory.

Demo 6:

  1. Create dynamic results with the help of custom parameters and a ForEach activity.

  2. Save the result into Azure Data lake with a dynamic name.

Demo 7:

Run your activities N times with the help of an Until loop (the do-while loop concept from programming languages).

Demo 8:

  • Create an Azure SQL Data Warehouse pool by using the Azure portal.

  • Create an Azure SQL Data Warehouse by using SQL statements from SSMS.

  • Connect and execute SQL statements against Azure SQL Data Warehouse.

  • Execute SQL statements against data in Azure Data Lake using PolyBase external tables, and load data from Azure Data Lake Storage into Azure SQL Data Warehouse.

  • Learn to use external tables and external data sources in Azure SQL Data Warehouse.

Upcoming Modules:

    • Azure SQL failover groups.

    • Azure Data Lake Analytics.

    • Introduction To Power BI.

    • HDInsight.

    • Azure Stream Analytics.



Who this course is for:
  • Any student who plans to take the DP-200 certification
  • Any student who wants to implement data solutions in the Azure cloud
Course content
103 lectures 09:04:41
+ Create Different Storage Resources In Azure
11 lectures 48:44
Introduction To RDBMS (What Is a Relational Database Management System & Its Uses)
04:32
Three Deployment Options Supported By Azure SQL DB
01:54
Single DB Deployment Vs Elastic Pool Deployment In Details
05:26
Azure SQL Purchasing Models (DTU / vCore & Serverless)
04:41
Demo: How To Connect To a Database From SSMS (SQL Server Management Studio)
03:46
Demo: How To Create A Storage Account In Azure
05:52
Different Types Of Configuration In Azure Storage Account
02:52
Data Replications In Azure Storage Account (What is Availability and Durability)
09:08
Azure Storage Redundancy
01:16
+ Introduction To Azure CosmosDB
5 lectures 39:30
What Is Azure CosmosDB
05:59
How To Create A CosmosDB Account
04:05
How To Create Database and Containers In A CosmosDB account
06:03
How To Insert Data Into CosmosDB And Retrieve Data From CosmosDB Using SQL
08:50
How To Access Cosmos DB From .NET (Create DB & Containers , Insert/Update Items)
14:33
+ Introduction To Azure Data Factory
4 lectures 24:11
Demo: How To Create An Azure Data Factory
04:05
Live Demo: Copy Data From On-Premises To Storage Account With Data Factory
06:33
Live Demo: Copy Data From On-Premises To Azure Storage Account In ADF
07:04
Live Demo: Copy Data From Azure Storage Account To Azure SQL Database
06:29
+ Azure SQL:- Implement Relational Data Storage Solutions In Azure
7 lectures 28:02
Demo: Migrating an On-Premises Database To Azure SQL Database
05:25
Demo: How To Create Elastic Pool (Azure SQL Database Elastic Pool Configuration)
05:13
Demo: Deploy More Databases Into Elastic pool
02:36
Demo: Delete Your Elastic Pool
01:00
Live Demo: Configure Geo Replication For Azure SQL Database
06:59
Demo: Configure Fail Over In Geo Replication-Make Secondary Readable as primary
05:41
+ Azure Data Factory :- Implement Batch Data Processing Solutions In Azure Part 1
9 lectures 32:44
Components Of Azure Data Factory
05:20
Live Demo: Create Resource (Azure Data Factory & Azure SQL)
03:57
Demo: Create A Pipeline To Copy On Premise Data To Azure
01:41
Live Demo: Create a Self-Hosted Linked Service To Connect To Input Data
05:50
Demo: Create A Data Set (Input Data Set)
02:39
Demo: Create Linked Service (Auto-Resolve Integration Runtime) To Output
03:27
Demo: Create A New Data Set To Save Your Output
01:30
Demo: Create New Azure Data Factory Copy Activity To Execute A Demo Project
06:42
+ Azure Data Lake:- Implement Azure Data Lake Storage Solutions In Azure
4 lectures 10:38
Introduction
01:17
Live Demo: Create An Azure Data Lake Solution
04:00
Demo: Install Azure Storage Explorer On-Premises
01:43
Demo: Upload Data Into Azure Data Lake Using Azure Storage Explorer
03:38
+ Azure DataBricks:- Implement Batch Data Processing Solutions In Azure Part 2
5 lectures 25:26
Live Demo: How To Create An Azure Databricks Workspace
01:57
Demo: How To Create A Cluster Inside An Azure DataBricks
03:23
Demo: Connect To Azure Data Lake From Azure Databricks - Important Configuration
07:09
Demo: Assign Proper Permission Using Access Control And Key Vault
06:05
Demo: Read CSV Stored In Azure Data Lake From Azure Databricks
06:52
+ Azure Synapse Analytics:- Implement RDBMS Storage Solutions- SQL Data Warehouse
10 lectures 32:11
Demo: Create and configure An Azure Synapse Analytics (Create A Data Warehouse)
05:13
Demo: How To Connect To Azure SQL Data Warehouse Using SSMS
03:45
Demo: How To Create a Data Warehouse From SSMS By Using SQL Scripts
03:42
Demo: Create A Master Key
01:41
Demo: Create A Database Scoped Credential
01:48
Demo: Create External Data Source
02:27
Live Demo: Create An External File Format
01:36
Live Demo: Create An External Table In Azure Synapse Analytics
04:23
Demo: Create A Staging Table (Index: Column Store & Distribution: Round Robin)
03:26
+ Building Data Flows In Azure Data Factory: Mastering Batch Data Processing Tools
18 lectures 01:55:09
Demo: Create A Connection To Azure Data Lake From Data Factory
03:35
Demo: First Data Flow In Azure Data Factory With A Filter Transformations
09:08
Demo: How To Run Your First Data Flow Transformation In Azure Data Factory
05:26
Live Demo: Configuration In Source Data Flow transformations
05:54
Demo: Union Data Flow Transfer In Azure Data Factory-Part 1
07:33
Demo: Union Data Flow Transfer In Azure Data Factory-Part 2
05:44
Demo: Join Data Flow Transformation In Azure Data Factory
09:50
Demo: Select Data Flow Transformations
06:15
Demo: Derived Column Data Flow Transfer In Azure Data Factory
06:02
Use Of Conditional Split Data Flow Transformations
06:07
Demo: Conditional Split Data Flow Transformation :- How To Use First Match?
11:19
Demo: Conditional Split Data Flow Transformation :- How To Use All Match
06:46
Demo: How To Use Exists Transformation In Mapping Data Flow
06:53
Demo: Exist And Don't Exist In Azure Data Factory
03:44
Demo: How To Use Aggregate Transformation In Mapping Data Flow
05:49