Connecting to Fabric Lakehouse, Power BI Datamart, and PowerApps Dataverse’s SQL Server from Desktop Tools and External Services

The main appeal of Power BI’s new Datamart capability is arguably that it includes a fully managed SQL Server instance optimized for analytical workloads, yet still exposes its contents to the outside via a SQL endpoint. This is a welcome change from the approach Microsoft took with dataflows, whose underlying data lake storage is not exposed to the outside unless you Bring Your Own Storage, a fairly tedious exercise that adds costs on top of your Power BI licenses.

Your connection string will look something like this:
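While the exact string comes from the datamart’s settings page in the Power BI service, the general shape can be sketched. Here is a minimal Python helper that assembles a pyodbc-style connection string, assuming the ODBC Driver 18 for SQL Server is installed; the server name used in the example is a hypothetical placeholder:

```python
# Sketch: assembling an ODBC connection string for a datamart's SQL endpoint.
# Only Azure AD authentication works against the endpoint, so the string uses
# ActiveDirectoryInteractive rather than SQL or Windows authentication.

def datamart_odbc_string(server: str, database: str = "master") -> str:
    """Build a pyodbc-style connection string for the datamart endpoint."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

# Hypothetical server name -- copy the real one from the datamart's settings:
conn_str = datamart_odbc_string("example-guid.datamart.pbidedicated.windows.net")

# With pyodbc installed you would then connect via:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
```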

Armed with this string you’d think you could connect easily to the database, but it turned out to be hit-and-miss during the early Public Preview (circa June 2022). At the end of January 2023, Microsoft announced that connectivity had been extended to any SQL client, which I was able to corroborate with help from Product Manager Salil Kanade (thanks!). So let’s look at the many ways you can access your datamart’s database.

While I haven’t tested all these tools as extensively with the Fabric Lakehouse yet, I’d expect you to have a similar experience in most cases. The Lakehouse SQL endpoint will look something like:

1. Power BI Datamart SQL Desktop & Cloud Connectivity Options

1.1. Power BI Desktop

You can easily connect from Power BI Desktop after choosing SQL Server (Sql.Database) or Azure SQL (Sql.Databases) as the source, then authenticating with an organizational (i.e. Azure AD) account that has access to the datamart.

The June 2022 version of Power BI Desktop added connectivity to Datamarts via the user interface, but this originally only listed the auto-generated DirectQuery datasets, not their underlying databases. If you use this connector you’ll create a Live Connection report (or a DirectQuery connection against the dataset, if you turn your model into a composite one), not a full-fledged data model.

Following the July 2022 release, browsing datamarts from Get Data > Datamarts surfaces data hub info and lets you choose whether to connect to the datamart’s dataset or to its SQL endpoint. When you choose the latter, it previously used the Sql.Databases connector. However, as of the March 2023 Power BI Desktop version, this now uses a new, undocumented PowerBI.Datamarts connector. Unlike the regular SQL connectors, this one doesn’t offer the option to insert a SQL statement.

Well done!

1.2. Other Desktop tools

  • You can easily connect from Excel Desktop.
  • Azure Data Studio wouldn’t work until I updated it to the latest version (1.36.2 at the time), at which point the connection could be established. There’s no need to specify the port; use AAD (Universal with MFA support) authentication and leave everything else at its default or empty.
  • SSMS 18 GA works, as does SSMS 19 GA (SSMS 19 Preview did not, because of a bug).
  • You can now set up an ODBC DSN with the latest v18 SQL ODBC driver.
  • I was able to connect with DBeaver using the SQL Server driver with Active Directory MFA authentication after installing the following artifacts from Database > Driver Manager:



Make sure to show all databases and schemas, not just master. This works with both Power BI datamarts and Fabric data warehouses.

If you can’t connect with any of the aforementioned desktop tools then check that your local firewall is not blocking port 1433. Note that Private Endpoints are not currently supported.
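A quick way to check port 1433 reachability from the client machine, before blaming the tool, is a plain TCP probe. A minimal sketch using only the Python standard library; the hostname in the comment is a hypothetical placeholder:

```python
import socket

def port_reachable(host: str, port: int = 1433, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS resolution failures.
        return False

# Replace with your datamart's actual server name (hypothetical here):
# port_reachable("example-guid.datamart.pbidedicated.windows.net")
```

If this returns False for your endpoint while other hosts work, a firewall blocking outbound 1433 is the likely culprit.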

1.3. Cloud Services

  • I could connect to both the datamart and lakehouse SQL endpoints without a hitch from a Power BI dataflow as well as a Fabric dataflow Gen2. Do I want to put a dataflow downstream of a datamart? Not sure about that, but that’s a consideration for another day.
  • In a PowerApps dataflow I was able to connect, even across tenants, as of January 2023. This used to fail back in mid-2022. Make sure to sign in when you set up the connection; you can use “master” as the database name.
  • You can connect from Power Automate; use Azure AD Integrated as the authentication type for your SQL connection. In the flow you’ll need to paste your server name and the long database name, as it won’t offer an option to auto-populate them. There are no datamart-specific triggers or actions as of February 2023.
  • A datamart’s database is not listed under “SQL Databases” in the Azure portal.
  • Power BI is not listed among the resources that support managed identities or service principals, so you can’t connect from an Azure Function or Azure Data Factory, though that may change in the future.

Regardless of the tool, once you connect you’ll see a database with a somewhat ugly name. Within it you’ll see nothing under Tables; everything is exposed via Views, where you’ll find your data (under the “model” schema) as well as metadata about any relationships you might have created in the service.
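Once connected, you can enumerate what the datamart exposes with a plain INFORMATION_SCHEMA query. A sketch, assuming an open pyodbc connection named `conn`:

```python
# Sketch: listing the views a datamart exposes in its "model" schema.
# INFORMATION_SCHEMA.VIEWS is standard T-SQL metadata, so this read-only
# query stays within the endpoint's DQL-only restrictions.
LIST_VIEWS_SQL = """
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.VIEWS
WHERE TABLE_SCHEMA = 'model'
ORDER BY TABLE_NAME;
"""

def list_model_views(conn):
    """Fetch (schema, view) pairs from the 'model' schema via pyodbc."""
    cursor = conn.cursor()
    cursor.execute(LIST_VIEWS_SQL)
    return [(row.TABLE_SCHEMA, row.TABLE_NAME) for row in cursor.fetchall()]
```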

You’ll only be able to run DQL (i.e. SELECT) queries, so you won’t be able to, say, create your own views via SQL. People have been clamoring for DDL and DML support, and it sounds like that’s coming soon.

I’ve been advising Power BI developers to pick up some SQL performance-tuning skills; it will be interesting to see what’s automatically managed for you vs. what you might be able to audit or even control yourself.

Finally, a significant limitation is that the datamart is not visible via the XMLA endpoint so you can’t work with it programmatically or via third-party tools such as Tabular Editor.

1.4. So Which Tool Should I Use? It Depends on How Deep You Want to Go


For ad hoc queries you can use the Visual Editor in the datamart service – which generates SQL queries via Power Query – or write your own SQL queries. You can see the SQL behind a visual query by right-clicking a step and selecting “View data source query”. It’s now possible to save visual and SQL queries, which was not the case at Preview launch. That said, you’ll be more productive using a desktop tool.

Azure Data Studio is a good choice as it’s free, relatively lightweight, and cross-platform. At first I thought SSMS didn’t add much value in this instance, as its administrative and user-management tools appear mostly irrelevant to the datamart’s already tightly managed SQL Server. I then noticed that Query Store is enabled by default in a datamart, hinting that there are use cases where SSMS is superior to ADS for datamarts.

I also tested SSMS’s SQL Server Import and Export Wizard as well as Generate Scripts, and found that both work to automate the copying of objects from your datamart’s database into an on-premises server (see this community thread for details).

The Datamart’s database is managed for you, but it’s not an entirely opaque black box. Pictured here: Query Store.

In Azure Data Studio you can save queries individually as .sql files or put your favorites in a SQL notebook, as pictured below:

2. Bonus: Dataverse Connectivity Options

I like to think of Dataverse as Datamart’s peer on the Power Apps side, as both are SaaS services meant to provide easy-to-provision SQL options for business users. Dataverse takes care of OLTP use cases, while Datamart is the OLAP counterpart.

Dataverse comes with a TDS (SQL) endpoint which allows read-only queries from:

  • SSMS 18.4+
  • Power BI dataflows
  • Power BI datasets
  • Power BI datamarts
  • Excel Desktop
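For context, the TDS endpoint listens on port 5558 of the environment’s hostname, and as with Datamart only Azure AD authentication is accepted. A sketch of assembling the connection string in Python; “yourorg” is a hypothetical environment name, and whether a given ODBC-based tool is actually supported is a separate question:

```python
def dataverse_tds_string(org_host: str) -> str:
    """Build an ODBC-style connection string for Dataverse's TDS endpoint.

    The endpoint listens on port 5558, and only Azure AD authentication
    is accepted (no SQL or Windows authentication).
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={org_host},5558;"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

# "yourorg" is a hypothetical environment name:
conn_str = dataverse_tds_string("yourorg.crm.dynamics.com")
```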

Azure Data Studio is not currently supported, though someone recently filed a feature request. I was not able to connect using DBeaver either.

Can’t do this with the Dataverse connector

Using the SQL connector rather than the Dataverse connector in Power BI has a couple of benefits:

  • You can include a SQL statement in your source step.
  • You can retrieve relationship columns in case you don’t know how tables are connected or need to denormalize several tables into one.
  • You can add failover support in case you replicate your Dataverse database to a SQL Server (presumably; I never tested this).

The “Navigate using full hierarchy” option, on the other hand, fails with this error: “Invalid table name SCHEMATA for schema INFORMATION_SCHEMA”.

That said, for larger query results (above 80 MB) you’ll want to avoid the SQL endpoint. It’s a good exploratory tool for databases you’re just getting acquainted with, provided you have a modicum of SQL skills.

As with Datamart, only Azure Active Directory authentication is supported, i.e. you can’t use SQL authentication, and obviously not Windows authentication.

Note that Dataverse SQL only supports a subset of T-SQL data types, statements, and functions.

If you wonder what TDS means: the Tabular Data Stream (TDS) protocol is “an application-level protocol used for the transfer of requests and responses between clients and database server systems. In such systems, the client will typically establish a long-lived connection with the server. Once the connection is established using a transport-level protocol, TDS messages are used to communicate between the client and the server.”

Finally, because Microsoft likes to make a spaghetti dish out of their product line, Dataverse can be loaded in Power BI via Azure Synapse Link which will store data in Azure Data Lake Storage Gen 2. Why would you ever want to do that? Here’s a hint:

“The Azure Synapse Link for Dataverse service supports initial and incremental writes for table data and metadata. Any data or metadata changes in Dataverse are automatically pushed to the Azure Synapse metastore and Azure Data Lake, depending on the configuration, without any additional action. This is a push, rather than pull, operation. Changes are pushed to the destination without you needing to set up refresh intervals.”

One thing is sure, your Microsoft Account Manager will love you if you contract Power Apps + Synapse + Power BI Premium…

It also looks like that’s the way to go if you want to ingest Dataverse data with Azure Data Factory.
