DP-600 Certification
The DP-600 exam (Fabric Analytics Engineer) covers a broad range of skills and tools related to Microsoft Fabric. Think of components and concepts such as:
- Data lakehouse
- Eventhouse
- Data warehouse
- Data modeling
- Data transformation
- Notebooks
- Dataflows Gen2
- Semantic model
I’ve created this page to keep all the learning resources in one place. I hope everyone preparing for this exam finds it useful and that it helps you master all the skills needed to become a certified Fabric Analytics Engineer.
Table of contents
Maintain a data analytics solution (25–30%)
Implement security and governance
- Implement workspace-level access controls
- Implement item-level access controls
- Implement row-level, column-level, object-level, and file-level access control
- Apply sensitivity labels to items
- Endorse items
Maintain the analytics development lifecycle
- Configure version control for a workspace
- Create and manage a Power BI Desktop project (.pbip)
- Create and configure deployment pipelines (a scripted sketch follows this list)
- Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models
- Deploy and manage semantic models by using the XMLA endpoint
- Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models
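Most of these lifecycle tasks have a UI path, but several can also be scripted. As a rough sketch, the snippet below triggers a deploy-all operation on a deployment pipeline through the Power BI REST API. The pipeline ID and the access token are placeholders you’d supply yourself, and the option flags shown are just the common ones, not an exhaustive list.

```python
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"  # placeholder: acquire via MSAL or similar
PIPELINE_ID = "<deployment-pipeline-id>"  # placeholder: your pipeline's GUID

# Deploy everything from one pipeline stage to the next
# (stage order 0 = Development, 1 = Test).
url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll"

body = {
    "sourceStageOrder": 0,
    "options": {
        "allowCreateArtifact": True,
        "allowOverwriteArtifact": True,
    },
}

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
)
response.raise_for_status()  # a 202 response means the deployment was accepted
```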
Prepare data (45–50%)
Get data
- Create a data connection
- Discover data by using OneLake data hub and real-time hub
- Ingest or access data as needed (see the notebook sketch after this list)
- Choose between a lakehouse, warehouse, or eventhouse
- Implement OneLake integration for eventhouse and semantic models
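As a concrete starting point for the ingestion item above, here is a minimal PySpark sketch that loads a file from the lakehouse Files area into a Delta table. The file path and table name are invented for illustration; in a Fabric notebook the `spark` session already exists, so the builder line is only there to keep the snippet self-contained.

```python
from pyspark.sql import SparkSession

# A Fabric notebook provides `spark` automatically; this keeps the
# sketch runnable elsewhere too.
spark = SparkSession.builder.getOrCreate()

# Hypothetical CSV sitting in the lakehouse Files area.
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("Files/raw/sales.csv")
)

# Write it as a managed Delta table so it shows up under Tables.
raw.write.format("delta").mode("overwrite").saveAsTable("sales_raw")
```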
Transform data
- Create views, functions, and stored procedures
- Enrich data by adding new columns or tables
- Implement a star schema for a lakehouse or warehouse
- Denormalize data (Watch on YouTube)
- Aggregate data
- Merge or join data
- Identify and resolve duplicate data, missing data, or null values (several of these steps are sketched after this list)
- Convert column data types
- Filter data
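Most of the transformation items above map to one or two DataFrame operations each. A minimal PySpark sketch, reusing the hypothetical sales_raw table from the earlier snippet plus an invented product_dim dimension:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

sales = spark.read.table("sales_raw")       # hypothetical fact data
products = spark.read.table("product_dim")  # hypothetical dimension

cleaned = (
    sales
    .dropDuplicates(["order_id"])                       # resolve duplicates
    .fillna({"quantity": 0})                            # handle missing values
    .withColumn("order_date", F.to_date("order_date"))  # convert column type
    .filter(F.col("quantity") > 0)                      # filter data
)

# Denormalize: copy product attributes onto the fact rows.
enriched = cleaned.join(products, on="product_key", how="left")

# Aggregate: total quantity per day and category.
daily = enriched.groupBy("order_date", "category").agg(
    F.sum("quantity").alias("total_quantity")
)

daily.write.format("delta").mode("overwrite").saveAsTable("sales_daily")
```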
Query and analyze data
- Select, filter, and aggregate data by using the Visual Query Editor
- Select, filter, and aggregate data by using SQL (sketched after this list)
- Select, filter, and aggregate data by using KQL
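All three query flavors express the same select-filter-aggregate pattern. It is sketched here with Spark SQL from a notebook (the warehouse T-SQL endpoint and KQL in an eventhouse follow the same shape); sales_daily is the invented table from the previous sketch:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Select, filter, and aggregate in SQL against the hypothetical table.
top_categories = spark.sql("""
    SELECT category,
           SUM(total_quantity) AS quantity
    FROM sales_daily
    WHERE order_date >= DATE '2024-01-01'
    GROUP BY category
    ORDER BY quantity DESC
    LIMIT 10
""")

top_categories.show()
```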
Implement and manage semantic models (25–30%)
Design and build semantic models
- Choose a storage mode (Watch on YouTube)
- Implement a star schema for a semantic model (Watch on YouTube)
- Implement relationships, such as bridge tables and many-to-many relationships
- Write calculations that use DAX variables and functions, such as iterators, table filtering, windowing, and information functions (a DAX sketch follows this list)
- Implement calculation groups
- Implement dynamic format strings
- Implement field parameters (Watch on YouTube)
- Identify use cases for and configure large semantic model storage format (Watch on YouTube)
- Design and build composite models
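For the DAX item, a handy way to experiment from a notebook is Fabric’s semantic-link (sempy) library, which can evaluate a DAX query against a semantic model and return a pandas DataFrame. The model, table, and column names below are made up; the query shows a variable plus an iterator (SUMX), the style of calculation this skill area asks about.

```python
import sempy.fabric as fabric  # Fabric's semantic-link library

# Hypothetical semantic model and column names, for illustration only.
result = fabric.evaluate_dax(
    dataset="Sales Model",
    dax_string="""
    EVALUATE
    SUMMARIZECOLUMNS(
        'Product'[Category],
        "Revenue",
            VAR LineTotal =
                SUMX( 'Sales', 'Sales'[Quantity] * 'Sales'[Unit Price] )
            RETURN
                LineTotal
    )
    """,
)
print(result)
```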
Optimize enterprise-scale semantic models
- Implement performance improvements in queries and report visuals
- Improve DAX performance
- Configure Direct Lake, including default fallback and refresh behavior
- Implement incremental refresh for semantic models (Watch on YouTube; a refresh sketch follows this list)
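Refresh behavior, whether a classic import refresh with an incremental policy or a Direct Lake reframe, can also be triggered programmatically. A minimal sketch against the Power BI REST API, with placeholder IDs and an access token assumed to be acquired elsewhere:

```python
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"   # placeholder
WORKSPACE_ID = "<workspace-guid>"          # placeholder
DATASET_ID = "<semantic-model-guid>"       # placeholder

# Kick off a refresh of the semantic model; with an incremental
# refresh policy in place, only the configured partitions reload.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "NoNotification"},
)
response.raise_for_status()  # 202 Accepted: the refresh was queued
```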