How to create Dashboards on your D365 FnO data in Azure Data lake using Azure Synapse Analytics SQL on-demand Serverless Pool

So you have successfully exported your FnO data to the data lake using the LCS Azure Data Lake export add-in. Now what? How can we consume the data in the lake to create beautiful dashboards?

I will not discuss how to export tables to the data lake in this article; you can follow the link below if you want to know more about that:

If you open Azure Storage Explorer and browse the data lake blob container, you will find a folder structure of mostly .csv data files and .json manifest files containing metadata: your FnO data in raw form. There are several ways to consume the data from this point, including:

  • Synapse Analytics
  • Power BI dataflows
  • Azure Databricks

I am going to use the Synapse Analytics serverless SQL pool to consume my data, because it lets me use my T-SQL skills to write powerful code and extract data from tables. The beauty of the Synapse serverless SQL pool is that it exposes your raw data as SQL views, so you can query your tables just like usual. It makes you feel at home when you work in SQL Server Management Studio and lets you comfortably write your mashup logic in the shape of T-SQL views and procedures.
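For instance, once the views exist (we will create them with CDMUTIL below), an exported table is queried with plain T-SQL. A minimal sketch, assuming the standard CUSTTABLE table was included in your export and the company id is illustrative:

```sql
-- Query an exported FnO table through its serverless view,
-- exactly as you would against the AX database.
SELECT TOP 10 ACCOUNTNUM, CUSTGROUP, DATAAREAID
FROM dbo.CUSTTABLE
WHERE DATAAREAID = 'usmf';
```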

So let's start the walkthrough without further delay. You will need to create a Synapse Analytics workspace in the same Azure region that houses your data lake and ERP.

Please note that Synapse serverless has a cost of USD $5 per TB of data processed; this will be charged against your subscription.

Create Azure Synapse Workspace

  1. Go to the Azure portal, search for Azure Synapse Analytics, and click the first result.
  2. On the Azure Synapse Analytics page, click Create.
  3. Select your data lake's subscription and resource group (or create a new one), enter a workspace name, and select a region (it MUST be the same as your storage account's region). Under Data Lake Storage Gen2, select your existing data lake account; this will be used for logs and outputs.
  4. On the Security tab, enter a SQL password for the sqladminuser account.
  5. Click Create to deploy the Synapse workspace.
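If you prefer scripting, the same workspace can be deployed from the Azure CLI. This is only a sketch: the workspace name, resource group, storage account, file system, password, and region below are all placeholders (the region must match your storage account's):

```shell
# Sketch only: all names and the password below are placeholders.
az synapse workspace create \
  --name xyzsynapse \
  --resource-group rg-datalake \
  --storage-account mydatalake \
  --file-system synapse-fs \
  --sql-admin-login-user sqladminuser \
  --sql-admin-login-password "xyzPassword" \
  --location westeurope
```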

Download CDMUTIL Tool

The magic of converting your raw data into a Synapse SQL database is done by a special tool provided by Microsoft, called CDMUTIL. The tool is available to download on GitHub.

Once you have downloaded the tool, you can deploy it in one of two ways:

  1. Console App
  2. Azure Function

We will use the Console App method in this tutorial. Extract the downloaded files and open the config file CDMUtil_ConsoleApp.dll.config. You will need to fill in the following information in this file:

  • TenantId (your Azure tenant ID)


  • AccessKey (access key of your data lake storage account)


  • ManifestURL (blob URL of the lake storage table manifest file)


  • TargetDbConnectionString (your Azure Synapse DB connection string)


The syntax of the connection string is as follows:

<add key="TargetDbConnectionString" value="Server=xyzsynapse-ondemand.sql.azuresynapse.net;Initial Catalog=xyzDB;User ID=sqladminuser;Password=xyzPassword" />        

  • AXDBConnectionString (optional; your local DEV D365 DB connection string, used in case you want to create data entities as views)
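Putting these keys together, the appSettings section of CDMUtil_ConsoleApp.dll.config ends up looking roughly like this. Every value below is a placeholder; take the actual manifest blob URL from Azure Storage Explorer (the elided part of the path depends on your environment):

```xml
<appSettings>
  <add key="TenantId" value="00000000-0000-0000-0000-000000000000" />
  <add key="AccessKey" value="yourDataLakeAccessKey" />
  <add key="ManifestURL" value="https://mydatalake.blob.core.windows.net/.../Tables.manifest.cdm.json" />
  <add key="TargetDbConnectionString" value="Server=xyzsynapse-ondemand.sql.azuresynapse.net;Initial Catalog=xyzDB;User ID=sqladminuser;Password=xyzPassword" />
</appSettings>
```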

If you also choose to process some data entities, you will need to take care of two more items:

  • The entry <add key="ProcessEntities" value="true"/> is already provided in the file; uncomment it.
  • Open the Manifest/EntityList.json file, enter the entity names you want, and comment out the rest of the entities.


Execute CDMUTIL

Once your config file is set up, it's time to fire up the tool. Open a command prompt with admin privileges, navigate to the CDMUTIL folder, and run CDMUtil_ConsoleApp.exe; the tool will start and log its progress in the console window.

After the tool has executed successfully, you are ready to explore your data lake metadata in SQL Server Management Studio or Azure Data Studio. Although Data Studio is more compatible, I personally prefer SQL Server Management Studio.

Enter your serverless endpoint URL in the server connection dialog and authenticate using Azure Active Directory. Expand your database, then expand the Views node.

All tables from your data lake are now listed as views, and you can create your own views and write queries to be used in visualization tools like Power BI.
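As a sketch of such a mashup view, assuming the standard CUSTTABLE and CUSTGROUP tables were included in your export:

```sql
-- Hypothetical reporting view joining two exported FnO tables.
CREATE VIEW dbo.CustomersWithGroupName
AS
SELECT c.ACCOUNTNUM,
       c.CUSTGROUP,
       g.NAME AS CUSTGROUPNAME,
       c.DATAAREAID
FROM dbo.CUSTTABLE AS c
JOIN dbo.CUSTGROUP AS g
  ON g.CUSTGROUP  = c.CUSTGROUP
 AND g.DATAAREAID = c.DATAAREAID;
```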

Power BI

Open Power BI, click Get Data, search for the Azure Synapse Analytics SQL data source, and click Connect. Enter your serverless endpoint in Server and, optionally, the database name in Database, and choose Import as the query mode. You will see your tables under the dbo node in the Power BI navigator; select your required tables or views.

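If you prefer to define the source in Power Query directly, the equivalent M is a single Sql.Database call; the server, database, and table names below are placeholders:

```powerquery
let
    // Connect to the serverless endpoint and pick one view.
    Source = Sql.Database("xyzsynapse-ondemand.sql.azuresynapse.net", "xyzDB"),
    Customers = Source{[Schema = "dbo", Item = "CUSTTABLE"]}[Data]
in
    Customers
```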

As an example, I built a dashboard in Power BI on top of these views.

Microsoft has come a long way in fine-tuning its BI reporting options for cloud-based ERP. Finally we have an option that lets us focus on building actual reports instead of integrations and data sources; it is almost seamless now (the near real-time feature was still in public preview at the time of writing this post).

Some Tips:

  • Keep a copy of all DDL statements (views, etc.) you run on the Synapse database.
  • If a table is not showing up under the Synapse DB Views node, try deactivating it in Dynamics and activating it again.
  • Keep a backup of the CDMUTIL tool's app config file.





