DP-600 Certification

The DP-600 exam (Fabric Analytics Engineer) covers a broad range of skills and tools related to Microsoft Fabric. Think of components and concepts such as:

  • Data lakehouse
  • Eventhouse
  • Data warehouse
  • Data modeling
  • Data transformation
  • Notebooks
  • Dataflows Gen2
  • Semantic model

I’ve created this page to keep all the learning resources in one single place. I hope everyone preparing for this exam finds it useful and that, in the end, it helps you master all the skills needed to become a certified Fabric Analytics Engineer.

Table of contents

Maintain a data analytics solution (25–30%)

Implement security and governance

  • Implement workspace-level access controls (see the sketch after this list)
  • Implement item-level access controls
  • Implement row-level, column-level, object-level, and file-level access control
  • Apply sensitivity labels to items
  • Endorse items
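
Many of these security settings can also be applied programmatically. Below is a minimal Python sketch using the Power BI REST API "Add Group User" endpoint to grant a user Contributor access to a workspace; the workspace ID, user email, and token value are placeholder assumptions:

```python
import requests

# Placeholder assumptions: an AAD access token with workspace admin rights
# and the GUID of the target workspace.
ACCESS_TOKEN = "<your-aad-access-token>"
WORKSPACE_ID = "<your-workspace-guid>"

# Power BI REST API: Groups - Add Group User
url = f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/users"

payload = {
    "emailAddress": "user@contoso.com",     # placeholder user
    "groupUserAccessRight": "Contributor",  # Admin / Member / Contributor / Viewer
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()
print("Access granted:", response.status_code)
```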

Maintain the analytics development lifecycle

  • Configure version control for a workspace
  • Create and manage a Power BI Desktop project (.pbip)
  • Create and configure deployment pipelines (see the sketch after this list)
  • Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models
  • Deploy and manage semantic models by using the XMLA endpoint
  • Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models
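
Deployment pipelines can be triggered programmatically as well. Here's a minimal sketch using the Power BI REST API "Deploy All" operation; the pipeline ID and token are placeholders, and sourceStageOrder 0 means deploying from the Development stage to the next one:

```python
import requests

# Placeholder assumptions: a valid AAD access token and the GUID
# of an existing deployment pipeline.
ACCESS_TOKEN = "<your-aad-access-token>"
PIPELINE_ID = "<your-pipeline-guid>"

# Power BI REST API: Pipelines - Deploy All
url = f"https://api.powerbi.com/v1.0/myorg/pipelines/{PIPELINE_ID}/deployAll"

payload = {
    "sourceStageOrder": 0,  # 0 = Development, 1 = Test
    "options": {
        "allowCreateArtifact": True,
        "allowOverwriteArtifact": True,
    },
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()
# The deployment runs asynchronously; track it via the pipeline operations endpoint.
print("Deploy request accepted:", response.status_code)
```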

Prepare data (45–50%)

Get data

  • Create a data connection
  • Discover data by using OneLake data hub and real-time hub
  • Ingest or access data as needed (see the sketch after this list)
  • Choose between a lakehouse, warehouse, or eventhouse
  • Implement OneLake integration for eventhouse and semantic models
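
To make the "Get data" skills more tangible, here's a minimal PySpark sketch of the kind you'd run in a Fabric notebook attached to a lakehouse. It ingests a raw CSV from the lakehouse Files area into a managed Delta table; the file path and table name are placeholder assumptions:

```python
# Runs in a Microsoft Fabric notebook attached to a lakehouse,
# where a SparkSession is already available as `spark`.

# Placeholder path: a CSV previously uploaded to the lakehouse Files area.
raw_df = spark.read.csv("Files/raw/sales.csv", header=True, inferSchema=True)

# Quick sanity check before persisting.
print(f"Ingesting {raw_df.count()} rows")

# Save as a managed Delta table in the lakehouse Tables area.
raw_df.write.format("delta").mode("overwrite").saveAsTable("sales")
```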

Transform data

Query and analyze data

  • Select, filter, and aggregate data by using the Visual Query Editor
  • Select, filter, and aggregate data by using SQL
  • Select, filter, and aggregate data by using KQL (both the SQL and KQL versions are sketched after this list)
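
And here's how the same select/filter/aggregate pattern looks in both languages. A minimal sketch: the SQL part uses Spark SQL in a Fabric notebook against the hypothetical `sales` table from the earlier example, while the KQL part uses the azure-kusto-data Python client; the eventhouse query URI, database, and table names are placeholder assumptions:

```python
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# --- SQL: select, filter, and aggregate with Spark SQL in a Fabric notebook ---
# Assumes an ambient `spark` session and the `sales` Delta table from earlier.
sales_by_region = spark.sql("""
    SELECT Region, SUM(Amount) AS TotalAmount
    FROM sales
    WHERE OrderDate >= '2024-01-01'
    GROUP BY Region
    ORDER BY TotalAmount DESC
""")
sales_by_region.show()

# --- KQL: the equivalent query against a KQL database in an eventhouse ---
# Placeholder query URI and database name; authenticates via your Azure CLI login.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<your-eventhouse>.kusto.fabric.microsoft.com"
)
client = KustoClient(kcsb)

kql = """
Sales
| where OrderDate >= datetime(2024-01-01)
| summarize TotalAmount = sum(Amount) by Region
| order by TotalAmount desc
"""
response = client.execute("SalesDB", kql)
for row in response.primary_results[0]:
    print(row["Region"], row["TotalAmount"])
```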

Implement and manage semantic models (25–30%)

Design and build semantic models

Optimize enterprise-scale semantic models

Resources