About the Author: Manish Luhana
Microsoft business applications specialist and certified trainer.
Categories: Dynamics 365, Finance and Operations
Published On: 5 December 2021

You chose Dynamics 365 for your company because you want more visibility and the insights to transform your business. But is it difficult to deliver timely insights? Is creating and maintaining complicated data pipelines too time-consuming?

If your solution includes a data lake, you can now link your Finance and Operations apps environment to it to simplify data pipelines and unlock the insights hidden in that data. Thanks to the general availability of the Export to Data Lake capability in Finance and Operations apps, data from your Dynamics 365 environment is now available in Microsoft Azure Data Lake.

Data lakes are ideal for big data analytics. You can use Microsoft Power BI to build powerful operational reports and analytics over a replica of your Dynamics 365 data in the data lake. Your data engineers can transform the data or apply machine learning models using Spark and other big data technologies. You can also use a data lake much as you would a SQL database: Azure Synapse Analytics serverless SQL pool endpoints make it simple to query large datasets in the lake using Transact-SQL (T-SQL).

Why limit yourself to business data? You can ingest legacy data from prior systems, as well as data from machines and sensors, at a fraction of the cost of keeping it in a SQL data warehouse. With Azure Synapse Analytics, you can quickly combine business data with inputs from sensors and machines: signals from the manufacturing floor can be joined with production schedules, and web logs from e-commerce sites can be combined with invoices and inventory movements.
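As a minimal sketch of that idea, the snippet below joins invented production-order data with invented sensor readings using pandas. The table and column names are illustrative only, not actual Finance and Operations schema:

```python
import pandas as pd

# Hypothetical production orders, as they might be exported to the lake
orders = pd.DataFrame({
    "OrderId": ["PO-1", "PO-2"],
    "Line": ["L1", "L2"],
})

# Hypothetical machine sensor readings ingested alongside them
sensors = pd.DataFrame({
    "Line": ["L1", "L1", "L2"],
    "TempC": [61.2, 63.8, 58.9],
})

# Join production schedules with manufacturing-floor signals,
# then aggregate a sensor metric per order
combined = orders.merge(sensors, on="Line")
avg_temp = combined.groupby("OrderId")["TempC"].mean()
print(avg_temp.to_dict())
```

In practice the same join would run at scale in Spark or Synapse rather than pandas, but the shape of the operation is the same.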

Can’t wait to put this feature to the test? Here are the steps to take.

Install the Export to Data Lake feature

The Export to Data Lake feature is an optional add-on that comes with your Dynamics 365 subscription. It is generally available in the following Azure regions: US, Canada, UK, Europe, Southeast Asia, East Asia, Australia, India, and Japan. You can enable this feature if your Finance and Operations apps environment is located in one of those regions.

To use this feature, you must first link your Finance and Operations apps environment to an Azure Data Lake and give permission to export and use the data.

To enable the Export to Data Lake feature, go to the Microsoft Dynamics Lifecycle Services portal and select the environment where you want to enable it. You must also provide the location of your data lake when you install the Export to Data Lake add-in. If you don’t have a data lake yet, follow the instructions in Install Export to Azure Data Lake add-in to create one in your Azure subscription.

Choose which data to send to a data lake

After the add-in is installed, you and your power users can launch the Finance and Operations app environment and select data to be exported to a data lake. Standard or customized tables and entities are available. When you select an entity, the system selects all of the underlying tables that make up the entity, so you don’t have to select each table individually.

It’s as simple as that. Your users can consume data from the data lake, and the system keeps the data fresh. On the screen, you can view the status of the exports, as well as the last time they were refreshed.

Work with data in the lake

Within the data lake, you’ll notice that the data is organized into a hierarchical folder structure. The data is organized by application area, then by module. A breakdown by table type is also available. The lake’s extensive folder structure makes it simple to organize and secure your data.

The data in each folder is stored in CSV files, and the files are updated in place as finance and operations data changes. The folders also contain metadata organized according to the Common Data Model metadata standard, which makes it simple for Azure Synapse, Power BI, and other third-party applications to ingest the data.
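To give a feel for that metadata, here is a simplified sketch of reading Common Data Model metadata to discover entities and their columns. The JSON below only loosely mimics the shape of the metadata files; the real files in the lake carry far more detail, and the entity and attribute names are invented:

```python
import json

# A toy, simplified stand-in for CDM metadata describing one table
model_json = """
{
  "name": "FinanceAndOperations",
  "entities": [
    {"name": "CustTable",
     "attributes": [{"name": "AccountNum", "dataType": "string"},
                    {"name": "CreditMax", "dataType": "decimal"}]}
  ]
}
"""

model = json.loads(model_json)
# List each entity with its column names, as a consumer tool might
for entity in model["entities"]:
    cols = [a["name"] for a in entity["attributes"]]
    print(entity["name"], cols)
```

Tools such as Power BI and Synapse read this metadata so they can interpret the CSV files without you supplying a schema by hand.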

If you want to work with data in Azure Data Lake using T-SQL, as if you were reading from a SQL database, you can use the CDMUtil tool, which is available on GitHub. This tool creates an Azure Synapse database over the data in the lake. You can then query the Synapse database using T-SQL, Spark, or Synapse pipelines, just as you would a SQL database.
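A hedged sketch of what such a query might look like from Python is shown below. The table and column names are placeholders, and actually executing the query would require pyodbc, a real Synapse workspace endpoint, and valid credentials; here only the T-SQL text is composed:

```python
# Illustrative T-SQL against a hypothetical table in the Synapse database
query = """
SELECT AccountNum, CreditMax
FROM dbo.CustTable
WHERE CreditMax > 10000;
""".strip()

# With a real serverless SQL endpoint, you might run it like this
# (commented out; endpoint and database names are placeholders):
# import pyodbc
# conn = pyodbc.connect(
#     "Driver={ODBC Driver 18 for SQL Server};"
#     "Server=<workspace>-ondemand.sql.azuresynapse.net;"
#     "Database=<synapse_db>;"
#     "Authentication=ActiveDirectoryInteractive;")
# rows = conn.cursor().execute(query).fetchall()

print(query)
```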

By combining data from many sources, you can turn the data lake into a full-fledged data warehouse. You can aggregate the data with SQL or Spark, or build pipelines with sophisticated transformations. Then, from within Azure Synapse, you can build Power BI reports: select the database and create a Power BI dataset in a single step. Your users can open this dataset in Power BI and produce rich reports.
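The kind of aggregation you might prepare before exposing a dataset to Power BI can be sketched in a few lines of pandas. The data and column names here are invented for illustration:

```python
import pandas as pd

# Hypothetical inventory movements (positive = receipt, negative = issue)
moves = pd.DataFrame({
    "ItemId": ["A", "A", "B", "B"],
    "Month": ["2021-11", "2021-12", "2021-11", "2021-12"],
    "Qty": [10, -4, 7, 3],
})

# Net movement per item per month, pivoted into a report-friendly shape
report = moves.pivot_table(index="ItemId", columns="Month",
                           values="Qty", aggfunc="sum")
print(report)
```

At warehouse scale the same aggregation would typically be expressed in SQL or Spark inside Synapse, with the result surfaced to Power BI as a dataset.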

