Demystifying DAX Studio: use cases unrelated to DAX

Just what I needed. Another place to run DAX.

Opening DAX Studio to a blank query pane is intimidating because it seems like the main thing to do is write DAX, then try to rewrite it to run faster. But it's not: most use cases I have found for DAX Studio have nothing to do with writing DAX, and everything to do with improving my data model.

DAX Studio is a free, open-source tool designed to help you work with Data Analysis Expressions (DAX) queries in Power BI, but it also generates useful metadata about Semantic Models, helps you understand how a DAX query is executed against your data model structure, and more.

Getting started

Get DAX Studio directly from the official website, daxstudio.org.

Once installed, you can connect to a Power BI semantic model in two ways:

  • Local Semantic Model: Open your PBIX in Power BI Desktop and navigate to the External Tools menu. DAX Studio will launch pre-connected.
  • Semantic Model hosted in Power BI Service: In the Power BI Service, open the Settings page of the target Semantic Model, then Server settings. Copy the Connection string (beginning with Data Source=powerbi://) to your clipboard. Open DAX Studio manually, which brings up a Connect panel. Select the Tabular Server radio button, paste the Connection string into the input box, and click Connect.

Use Cases

Analyze and optimize the data model

Setup, viewing the semantic model summary

  1. Click the Advanced menu at the top, then View Metrics. A new tab called VertiPaq Analyzer appears at the bottom; click it.
  2. Under VertiPaq Analyzer, click the Summary menu. The Semantic model size in memory figure is a quick gauge of whether the model is large enough to justify investing time in reducing its size. Revisiting this number after each refinement also gives you a way to measure your progress.

Model Metrics 

Below, we will step through each view of the metrics and what to look for.


Tables

This is the default view.
  • Focus on the tables that take the most space ("Total Size") for optimization.
  • For Tables, the Cardinality column contains the row count.
  • The RI Violations column also gives you a preview of a referential integrity issue to check into later, under Relationships. (Hover over any column header for a brief definition throughout.) 
  • "LocalDate…" tables indicate the Auto date/time setting is enabled; disable it in Power BI Desktop options to remove these hidden tables.


Columns

This view contains all columns in one list, regardless of table.
  • Prioritize the largest columns ("Total Size") for optimization.
  • Don't over-normalize; instead, remove what fields you can and reduce the number of distinct values in the fields that remain.


Relationships

This view shows properties of the model's relationships.
  • First, review the relationship cardinalities, which should generally all be many-to-one.
  • Next, review the key field cardinalities. Two items to note regarding the cardinalities are:
    1. A very high From Cardinality is likely to make the relationship slower.
    2. An excessively high To Cardinality (higher than the From Cardinality) may mean you are carrying extraneous reference data, ex., a full Patient dimension table with Patients for all time when your fact table only has Patient transactions for the last year.
  • Lastly, the Missing Keys, Invalid Rows, and Sample Violations columns flag missing reference data -- the inverse of the Patient dimension example above. They indicate that some facts will not roll up to any dimensional value from a reference data table.
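If Missing Keys or Invalid Rows is non-zero, you can list the offending keys yourself from the query pane. A minimal sketch, assuming hypothetical Sales and Patient tables related on a PatientKey column:

```dax
-- Fact keys with no matching row in the dimension
-- (table and column names are placeholders for your model)
EVALUATE
EXCEPT (
    DISTINCT ( Sales[PatientKey] ),
    DISTINCT ( Patient[PatientKey] )
)
```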

Measure and understand query performance

Besides simply measuring the run time of a query -- essential for tracking progress when tuning performance -- DAX Studio helps you understand the query logic behind your visuals in a form that is easier to follow than raw DAX. Here's how to analyze a specific visual's DAX query:

  1. Open the Power BI Desktop report and navigate to the Performance Analyzer under the Optimize menu.
  2. Start recording, then open the page with your target visual. Stop recording once done.
  3. Locate the relevant activity in the list and copy the DAX query. Note: Selecting a step will also highlight the visual, giving you some feedback that you have found the relevant query.
  4. Launch DAX Studio and paste the query into the query pane.
  5. Click Server Timings (opens Server Timings tab), then Clear Cache, then Run.
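The query you paste in step 4 will typically be a SUMMARIZECOLUMNS call wrapping the fields and measures on the visual. A hypothetical example of the shape to expect (the table, column, and measure names here are illustrative, not from a real report):

```dax
EVALUATE
SUMMARIZECOLUMNS (
    'Date'[Year],
    'Product'[Category],
    "Total Sales", [Total Sales]  -- the visual's measure
)
```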

Click the Server Timings tab to review what it captured about your query -- or really, the data model:

  1. First, note whether multiple SE (Storage Engine) queries were required. This can happen with complex DAX expressions or with complex data models, such as those containing many-to-many relationships.
  2. Next, look at the individual queries by selecting each to display xmSQL output in the right pane. This is a SQL-like representation of the requests that the Formula Engine sent to the Storage Engine, so it's easier to understand the work that Power BI is doing based on the combination of the data model structure and DAX query running against it.
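For example, a simple bar chart might produce xmSQL along these lines (an illustrative sketch, not a real capture; your table and column names will differ):

```
SELECT
    'Date'[Year],
    SUM ( 'Sales'[Amount] )
FROM 'Sales'
    LEFT OUTER JOIN 'Date' ON 'Sales'[DateKey]='Date'[DateKey];
```

Reading the xmSQL this way quickly shows which tables were scanned and which joins the relationships produced.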

Server Timings


Two lesser known features

Impact analysis

Right-click a table, column, or measure in your Semantic Model, then choose the "Show Objects That Reference…" option for that object type. DAX Studio automatically generates and runs a query that returns a list of dependencies. (It queries a DMV. See the next section for more on these.)
For example, if the grain of a dimension table is changing, you can quickly and comprehensively see all of the measures that refer to the dimension table to gauge whether they would still be valid after changing the grain, ex., looking for functions that count rows or columns in the table.
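Under the hood, the generated query targets the dependency DMV. A sketch of what it looks like, assuming a hypothetical Patient table (the exact columns and filter DAX Studio emits may differ):

```sql
SELECT [OBJECT_TYPE], [TABLE], [OBJECT], [REFERENCED_OBJECT_TYPE], [REFERENCED_OBJECT]
FROM $SYSTEM.DISCOVER_CALC_DEPENDENCY
WHERE [REFERENCED_TABLE] = 'Patient'
```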

Dynamic Management Views (DMVs)

Click the DMV menu to see a list of all of the system tables (queryable with SQL) you may want to use to learn about the connected semantic model. Drag and drop the view into the query pane to generate a query against it.
One of the DMVs provides the data for the impact analysis, but there are so many others to explore that it's worth familiarizing yourself with them so you'll know when you can leverage them to work smarter, not harder.
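For instance, listing every measure with its expression is a one-liner against the TMSCHEMA_MEASURES view, one of the DMVs in that list (this sketch assumes the connected model has measures defined):

```sql
SELECT [Name], [Expression]
FROM $SYSTEM.TMSCHEMA_MEASURES
```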
You could use this information to build a report with better visualizations of what the VertiPaq Analyzer surfaces, plus encode insights that could be applied to any Semantic Model you connect to, in a repeatable, automated way.

Coincidentally, Microsoft just released many of these DMVs as DAX functions, so another interesting use case might be to build a descriptive interface for browsing metadata when you create a Semantic Model that serves up data for thin reports, instead of leaving the report pages completely blank.
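For example, model metadata such as the measure list can now be retrieved in plain DAX (in recent Power BI versions) without touching a DMV:

```dax
-- Returns one row per measure in the connected model
EVALUATE INFO.MEASURES()
```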

