DP-900 : Microsoft Azure Data Fundamentals Practice Tests
Candidates for this exam should have foundational knowledge of core data concepts and how they are implemented using Microsoft Azure data services.
This exam is intended for candidates beginning to work with data in the cloud.
Candidates should be familiar with the concepts of relational and non-relational data, and different types of data workloads such as transactional or analytical.
Azure Data Fundamentals can be used to prepare for other Azure role-based certifications like Azure Database Administrator Associate or Azure Data Engineer Associate, but it’s not a prerequisite for any of them.
Describe core data concepts (15-20%)
Describe types of core data workloads
- describe batch data
- describe streaming data
- describe the difference between batch and streaming data
- describe the characteristics of relational data

Describe data analytics core concepts
- describe data visualization (e.g., visualization, reporting, business intelligence)
- describe basic chart types such as bar charts and pie charts
- describe analytics techniques (e.g., descriptive, diagnostic, predictive, prescriptive, cognitive)
- describe ELT and ETL processing
- describe the concepts of data processing
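The ETL-versus-ELT distinction above comes down to *when* the transform happens relative to the load. A minimal sketch in Python (the lists standing in for a source, a warehouse, and a data lake are purely illustrative):

```python
# ETL: Extract -> Transform -> Load (data is cleaned *before* it lands in the target)
raw = ["  Alice,34 ", " Bob,29"]

rows = [line.strip() for line in raw]                       # Extract
records = [{"name": n, "age": int(a)}                       # Transform
           for n, a in (r.split(",") for r in rows)]
warehouse = []
warehouse.extend(records)                                   # Load

# ELT: Extract -> Load -> Transform (raw data lands first, transform runs in the target)
lake = list(raw)                                            # Load raw data as-is
cleaned = [line.strip() for line in lake]                   # Transform later, in place
```

ELT is common with data lakes and scalable warehouse engines, where the target system itself has the compute to do the transformation.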
Describe how to work with relational data on Azure (25-30%)
Describe relational data workloads
- identify the right data offering for a relational workload
- describe relational data structures (e.g., tables, indexes, views)

Describe relational Azure data services
- describe and compare PaaS, IaaS, and SaaS delivery models
- describe Azure SQL Database
- describe Azure Synapse Analytics
- describe SQL Server on Azure Virtual Machines
- describe Azure Database for PostgreSQL, Azure Database for MariaDB, and Azure Database for MySQL
- describe Azure SQL Managed Instance

Identify basic management tasks for relational data
- describe provisioning and deployment of relational data services
- describe methods of deployment, including ARM templates and the Azure Portal
- identify data security components (e.g., firewall, authentication)
- identify basic connectivity issues (e.g., accessing from on-premises, access with Azure VNets, access from the Internet, authentication, firewalls)
- identify query tools (e.g., Azure Data Studio, SQL Server Management Studio, the sqlcmd utility)

Describe query techniques for data using the SQL language
- compare DDL versus DML
- query relational data in PostgreSQL, MySQL, and Azure SQL Database
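The DDL-versus-DML comparison in this objective is easy to see in a small example. The sketch below uses Python's built-in SQLite driver as a stand-in for a relational database; the statements are standard SQL, and the equivalent T-SQL for Azure SQL Database looks very similar:

```python
import sqlite3

# In-memory database used purely for illustration
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL (Data Definition Language): defines structures such as tables and indexes
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)")
cur.execute("CREATE INDEX idx_products_name ON products (name)")

# DML (Data Manipulation Language): reads and changes the data inside those structures
cur.execute("INSERT INTO products (name, price) VALUES (?, ?)", ("Widget", 9.99))
cur.execute("UPDATE products SET price = 8.99 WHERE name = ?", ("Widget",))
row = cur.execute("SELECT name, price FROM products").fetchone()
conn.close()
```

In short: CREATE/ALTER/DROP are DDL; SELECT/INSERT/UPDATE/DELETE are DML.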
Describe how to work with non-relational data on Azure (25-30%)
Describe non-relational data workloads
- describe the characteristics of non-relational data
- describe the types of non-relational and NoSQL data
- recommend the correct data store
- determine when to use non-relational data

Describe non-relational data offerings on Azure
- identify Azure data services for non-relational workloads
- describe Azure Cosmos DB APIs
- describe Azure Table storage
- describe Azure Blob storage
- describe Azure File storage

Identify basic management tasks for non-relational data
- describe provisioning and deployment of non-relational data services
- describe methods of deployment, including ARM templates and the Azure Portal
- identify data security components (e.g., firewall, authentication)
- identify basic connectivity issues (e.g., accessing from on-premises, access with Azure VNets, access from the Internet, authentication, firewalls)
- identify management tools for non-relational data
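A defining characteristic of non-relational data is that items addressed by a key need not share a fixed schema. The toy key/document store below illustrates that pattern, which underlies services such as Azure Cosmos DB and Azure Table storage; the function names here are illustrative, not an Azure API:

```python
import json

# In-memory document store: schemaless JSON documents addressed by a key
store = {}

def upsert(key, document):
    # Documents are stored as serialized JSON; no schema is enforced
    store[key] = json.dumps(document)

def get(key):
    return json.loads(store[key])

upsert("user:1", {"name": "Alice", "tags": ["admin"]})
upsert("user:2", {"name": "Bob", "city": "Madrid"})  # different fields: no fixed schema
```

Contrast this with a relational table, where every row must conform to the same column definitions.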
Describe an analytics workload on Azure (25-30%)
Describe analytics workloads
- describe transactional workloads
- describe the difference between a transactional and an analytics workload
- describe the difference between batch and real-time processing
- describe data warehousing workloads
- determine when a data warehouse solution is needed

Describe the components of a modern data warehouse
- describe Azure data services for modern data warehousing such as Azure Data Lake, Azure Synapse Analytics, Azure Databricks, and Azure HDInsight
- describe modern data warehousing architecture and workloads

Describe data ingestion and processing on Azure
- describe common practices for data loading
- describe the components of Azure Data Factory (e.g., pipelines, activities)
- describe data processing options (e.g., HDInsight, Azure Databricks, Azure Synapse Analytics, Azure Data Factory)

Describe data visualization in Microsoft Power BI
- describe the role of paginated reporting
- describe the role of interactive reports
- describe the role of dashboards
- describe the workflow in Power BI
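The key Azure Data Factory concept in this objective is that a pipeline is an ordered grouping of activities (copy, transform, and so on) run against a dataset. A minimal sketch of that idea; the function names are illustrative, not the Data Factory API:

```python
# A pipeline modeled as an ordered list of activities, mirroring the
# Azure Data Factory concept of a pipeline grouping copy/transform steps.

def copy_activity(data):
    # Ingest/copy step: bring the source data into the pipeline unchanged
    return list(data)

def transform_activity(data):
    # Transformation step: here, a trivial doubling of each value
    return [x * 2 for x in data]

def run_pipeline(activities, data):
    # Activities run in sequence; each consumes the previous one's output
    for activity in activities:
        data = activity(data)
    return data

result = run_pipeline([copy_activity, transform_activity], [1, 2, 3])
```

In Data Factory itself, pipelines are defined declaratively (JSON/ARM) and scheduled by triggers rather than called as functions, but the data flow between chained activities follows this shape.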
Who this course is for:
- Beginner students who want to become Data Engineers
- Students who want to master the basics of data fundamentals and eventually sit the DP-200 and DP-201 exams
- Your first step toward becoming an Azure Data Engineer
I work as an Azure DevOps Engineer, responsible for the automated deployment process and timely build releases to UAT and PROD environments, with 8 years of experience as a DevOps & Build Release Engineer and Trainer. 5000+ happy students.
Cleared the Azure DevOps AZ-400 exam (75%) and the AZ-900 exam (93%).
I have been MCT certified for three years and hold 13+ certifications.
• Proficient in requirements gathering and analysis
• Effective communicator who enjoys building and maintaining client relationships
• International experience working with US and Spain clients.
• Release, build, and deployment experience; provided support in PRODUCTION/UAT environments.
• Ability to multi-task and prioritize work as per deadlines.
• Pursuing a challenging and respectable career in an organisation by learning and growing.
• Worked on AWS EC2, Elastic Beanstalk, S3, VPC, migrations, and other AWS services.
• Responsible for creating branches after every PSI.
• Created Jenkins pipelines for daily automated deployments.
• Used PowerShell and Python plugins in Jenkins as scripts for deploying applications.
• Managing PRODUCTION deployments for eight applications; troubleshooting and closing deployments within the deployment window.
• Experience with CI Tools for build and deployment.
• Experience working in an agile development environment; familiar with Agile/Scrum principles.
• Worked on an in-built deployment tracking tool.