How to create Dashboards on your D365 FnO data in Azure Data Lake using the Azure Synapse Analytics Serverless SQL (on-demand) Pool
So you have successfully exported your FnO data to the data lake using the LCS Azure export add-in, now what? How can we consume the data in the data lake to create beautiful dashboards?
I will not discuss how to export tables to the data lake in this article; you can follow the link below if you want to know more about that:
If you open Azure Storage Explorer and browse the data lake blob container, you will see a folder structure containing mostly .csv data files and .json manifest files: the manifests hold the metadata and the .csv files hold your FnO data in raw form. There are different ways to consume the data from this point, including:
I am going to use the Synapse Analytics SQL Serverless pool to consume my data, because it lets me use my T-SQL skills to write powerful code and extract data from the tables. The beauty of the Synapse Serverless SQL pool is that it exposes your raw data as SQL views, so you can query your tables just like usual. It makes you feel at home when you work in SQL Server Management Studio and allows you to comfortably write your mashup logic in the shape of T-SQL views and procedures.
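To give you an idea of what that looks like, here is a simplified sketch of the kind of view the serverless pool uses to expose a raw .csv folder as a table. The storage account, folder path and column list below are purely illustrative; the real definitions are generated from the CDM manifest by the tool we will set up shortly.

-- Illustrative only: a serverless SQL view over raw CustTable .csv files in the lake.
-- Storage account, container, folder path and column list are placeholders.
CREATE OR ALTER VIEW dbo.CUSTTABLE
AS
SELECT *
FROM OPENROWSET(
        BULK 'https://mydatalake.dfs.core.windows.net/dynamics365-financeandoperations/Tables/CustTable/*.csv',
        FORMAT = 'CSV',
        PARSER_VERSION = '2.0'
     )
     WITH (
        ACCOUNTNUM NVARCHAR(20),
        CUSTGROUP  NVARCHAR(10),
        CURRENCY   NVARCHAR(3),
        RECID      BIGINT
     ) AS rows;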
So let's start the walkthrough without further delay. You will need to create a Synapse Analytics workspace in the same Azure region that houses your data lake and ERP.
Please note that Synapse Serverless is not free: queries are billed at around USD 5 per TB of data processed, and this will be charged against your Azure subscription.
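If you want to keep an eye on that cost, the serverless pool reports how much data your queries have consumed. Assuming the sys.dm_external_data_processed DMV exposed by Synapse serverless SQL pools, a quick check looks like this:

-- How much data (in MB) the serverless pool has processed in the current
-- day, week and month; useful for keeping the per-TB cost in view.
SELECT type, data_processed_mb
FROM sys.dm_external_data_processed;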
Create Azure Synapse Workspace
Download CDMUTIL Tool
The magic of converting your raw data into a Synapse SQL database is done by a tool provided by Microsoft called CDMUtil, which is available for download on GitHub.
Once you have downloaded the tool, you can deploy it in two ways:
We will use the console app method in this tutorial. Extract the downloaded files and open the config file CDMUtil_ConsoleApp.dll.config. You are going to need the following information to fill in this file:
The syntax of the connection string is as follows:
<add key="TargetDbConnectionString" value="Server=xyzsynapse-ondemand.sql.azuresynapse.net;Initial Catalog=xyzDB;User ID=sqladminuser;Password=xyzPassword" />
If you choose to process some data entities as well, there are two more items you will need to take care of: the list of entities you want the tool to create, and a connection string to the source AXDB so the tool can read the entity metadata.
Execute CDMUTIL
Once your config file is set up, it's time to fire up the tool. Open a command prompt with admin privileges, navigate to the CDMUtil folder, and run CDMUtil_ConsoleApp.exe. This will start the tool and you will see a screen similar to this:
After the tool has executed successfully, you are ready to explore your data lake metadata in SQL Server Management Studio or Azure Data Studio. Although Data Studio is more compatible, I personally prefer SQL Server Management Studio.
Enter your serverless endpoint URL as the server name and authenticate using Azure Active Directory. Expand your database and expand the Views node:
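If you prefer a query to clicking through Object Explorer, a quick way to list everything the tool created is to query the standard SQL catalog views (nothing CDMUtil-specific here):

-- List all views the tool created in the serverless database.
SELECT s.name AS schema_name, v.name AS view_name
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
ORDER BY s.name, v.name;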
You have all the tables from your data lake listed as views; now you can create your own views and write queries to be used in visualization tools like Power BI, for example a mashup view like the one sketched below.
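Here is a hypothetical reporting view you might layer on top of the generated ones. The table names CUSTTABLE and CUSTTRANSOPEN and the column list are illustrative; adjust them to whatever you actually exported.

-- Hypothetical mashup view: open customer balances per customer group.
-- Table and column names are illustrative; align them with your own exports.
CREATE OR ALTER VIEW dbo.CustomerOpenBalances
AS
SELECT  c.ACCOUNTNUM,
        c.CUSTGROUP,
        SUM(t.AMOUNTCUR) AS OpenAmount,
        COUNT(*)         AS OpenTransactions
FROM dbo.CUSTTABLE     AS c
JOIN dbo.CUSTTRANSOPEN AS t
     ON t.ACCOUNTNUM = c.ACCOUNTNUM
GROUP BY c.ACCOUNTNUM, c.CUSTGROUP;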
Power BI
Open Power BI, click Get Data, search for the Azure Synapse Analytics SQL data source, and click Connect. Enter your serverless endpoint as the server, the database name (optional) as the database, and choose Import as the query mode. You will see your tables under the dbo node in the Power BI navigator; select the tables or views you need.
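If you would rather not import a whole view, the connector's advanced options also accept a SQL statement, so you can push a query straight to the serverless pool. A trivial example against the hypothetical view sketched earlier:

-- Pasted into the optional SQL statement box of the Synapse connector.
SELECT ACCOUNTNUM, CUSTGROUP, OpenAmount
FROM dbo.CustomerOpenBalances;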
The following is an example of a dashboard I built in Power BI:
Microsoft has come a long way in fine-tuning its BI reporting options for the cloud-based ERP. Finally we have an option where we can focus on building actual reports instead of integrations and data sources; it is almost seamless now (the near real-time feature is still in public preview at the time of writing this post).
Some Tips:
D365 Finance & Operations Technical Consultant; A365 Technical Consultant (Automotive ISV)
1yI've added 2 data entities to be processed and I'm able to see the view in SMSS but it is taking quite some time to execute the query. Do you encounter this before?
Data Analyst | BI Developer 👉 Lean Data Pipelines & Reports 📊 | 1K+ Followers
2yThanks for sharing this useful information and the additional helpful links 👍
Projectmanager @ Mercedes-Benz AG
2yThanks Ali Sanan for writing this great post. We would like to use the data entities of D365 F&O for the best possible overall architecture, of course. The description says that the connection string is mandatory for AXDB. However, this information is not available for production environments. How can the data entities be used here via the Datalake and Synapse?
Microsoft D365FO certified Dev | Shopper value| Footfall-KPI I Power Bi - Data Science learner
3yvery informative. sir is there any article on integration?
D365 SCM Techno functional Solution Architect
3yAmazing Ali Bahi, keep sharing good stuff