Saturday, 29 March 2025

Introduction to Dynamics 365 CE Data Migration using ADF


Dynamics 365 CE Data Migration using ADF can be necessary for various reasons, such as archiving historical data, integrating with other systems, or performing data analysis in a different environment. However, extracting and transforming data from Dynamics 365 CE Service can be challenging due to its complex data model and security considerations.

Azure Data Factory (ADF) offers a robust solution with its Metadata-Driven Pipeline feature. This approach of Dynamics 365 CE Data Migration using ADF enables you to define your data extraction process using metadata, including source entities, data fields, and transformations.

By leveraging this Dynamics 365 CE Data Migration using ADF approach, you can:

  • Streamline data extraction: Automate the process of identifying and retrieving data from Dynamics 365 CE.
  • Improve data quality: Ensure data consistency and accuracy during the extraction process.
  • Enhance security: Implement robust security measures to protect sensitive data during migration.
  • Increase efficiency: Accelerate the data extraction process and reduce manual effort.

In this blog, we will move Case records from Dataverse to an Azure Storage Account container as a JSON blob file.


Steps for Dynamics 365 CE Data Migration using ADF

Step 1:

Visit your Azure Resource group and create a Data Factory resource. Once deployed, launch the studio.

Azure Resource Group
Create Data Factory Resource

Step 2:

In the ADF studio, create Linked Services for your Dynamics 365 CE (Dataverse) environment and for the target Azure Storage Account blob container.

Dataverse Linked Service:

Dataverse Linked Service

It is highly recommended to choose Service Principal as the Authentication Type. Once you have entered the Application ID and the Secret key, test the connection.

Service Principal as Authentication Type
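Behind the studio form, ADF stores the linked service as JSON. A minimal sketch of a Dataverse linked service using service principal authentication is shown below; the name, environment URL, and credential values are placeholders, and in practice the secret should be referenced from Azure Key Vault rather than stored inline:

```json
{
  "name": "DataverseLinkedService",
  "properties": {
    "type": "CommonDataServiceForApps",
    "typeProperties": {
      "deploymentType": "Online",
      "serviceUri": "https://yourorg.crm.dynamics.com",
      "authenticationType": "AADServicePrincipal",
      "servicePrincipalCredentialType": "ServicePrincipalKey",
      "servicePrincipalId": "<application-id>",
      "servicePrincipalCredential": {
        "type": "SecureString",
        "value": "<secret-key>"
      }
    }
  }
}
```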

Azure Storage Account Linked Service:

Azure Storage Account New Linked Service
Azure Storage Account Linked Service

Step 3:

Once both Linked Services are created, navigate back to the ADF studio, click on the Ingest option, and choose the Metadata-driven copy task. From there, we will set up the Control Table.

Ingest Option
Metadata-driven Copy Task

Step 4:

Have a SQL Server and database ready as a prerequisite. Now, select Type as Azure SQL Database, choose your Azure Subscription, Server Name, and Database Name, and set Authentication Type to System Assigned Managed Identity.

SQL Server and DB as Prerequisite

Step 5:

Here comes the most important bit. We need to configure the system assigned managed identity by running a T-SQL script in the SSMS (SQL Server Management Studio) tool. The script creates the Managed Identity as a user and elevates its privileges to db_owner.

Running T-SQL Script in SSMS
CREATE USER [Managed Identity] FROM EXTERNAL PROVIDER;
ALTER ROLE db_owner ADD MEMBER [Managed Identity];

Step 6:

Test connection to the Linked Service.

Test Connection to Linked Service

Step 7:

Next, specify your data source as your Dataverse Linked Service.

Specify Data Source

Step 8:

We select the first table from which we wish to fetch data. In our instance, we select the Case (Incident) table.

Select Case (Incident) Table
Choose Loading Behavior

Step 9:

On the next step, select the Azure Blob Linked Service as Destination Data Store, specifying the target file path and name.

Destination Data Store

Step 10:

Then, we select the File format as JSON and its pattern as “Array of objects”.

File Format Settings
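With the “Array of objects” pattern, every Case record becomes one object inside a single top-level JSON array. A hypothetical illustration of the blob content (the columns and values here are placeholders; the actual fields depend on your mapping):

```json
[
  {
    "incidentid": "00000000-0000-0000-0000-000000000001",
    "title": "Printer not working",
    "ticketnumber": "CAS-01001"
  },
  {
    "incidentid": "00000000-0000-0000-0000-000000000002",
    "title": "Cannot log in",
    "ticketnumber": "CAS-01002"
  }
]
```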

Step 11:

On the next step, we ensure that the system has auto-mapped the fields correctly.

Schema Mapping

Step 12:

We give a custom name to our ADF Copy Task.

Custom Name to ADF Copy Task

Step 13:

Finally, we check the summary of this entire configuration before we click Finish.

Check the Summary

Step 14:

On completion of the deployment, a SQL script is generated. We execute it in the SSMS tool as a new query.

Deployment Complete
SQL Script Generated

Step 15:

After the successful execution of the script, we finish setting up the entire configuration in the Azure portal. Under our pipeline, three components are created which drive our integration: Top Level (Control Table), Middle Level, and Bottom Level.

Setting up Configuration on Azure Portal


Unit Testing

  • We execute the pipeline by clicking the Debug button and verify that all three tasks run successfully.
  • We confirm that the Incident JSON blob is created as expected.
  • The resultant JSON is an array of objects, one object per Case record.
  • We then add the “Subject” table to the list of tables.
  • This generates an updated T-SQL script, which we execute again in the SSMS tool.
  • Our Control Table now has two rows: one for Case and another for Subject.
  • Finally, when we execute the pipeline again, we confirm that JSON blobs are created for both the Case and Subject tables.

Conclusion – Dynamics 365 CE Data Migration using ADF

Hence, in this blog we learnt how to implement Dynamics 365 CE Data Migration using ADF metadata-driven pipelines, and saw how easy the approach is to configure for multiple Dataverse tables.

Trigger Power Automate Flow with JS in Dynamics 365 CRM

One effective way to extend CRM functionality is by integrating Power Automate with JavaScript. In this blog, we will walk through how to call a Power Automate flow when a button is clicked within Dynamics 365 CRM. This approach is useful for real-time processes like data validation, sending emails, or updating records based on specific conditions.

WHY CALL POWER AUTOMATE FROM JAVASCRIPT?


Power Automate allows us to build workflows that connect various applications and services. While Power Automate has built-in triggers and actions, you may want to trigger a flow programmatically using JavaScript. This approach offers flexibility for customizing logic and adding automation without the user needing to interact with the flow manually.

For example, you might want to update a record or send an email when a user clicks a button in CRM form. With JavaScript, this can be done smoothly.


PREREQUISITES  

Before diving into the steps, ensure you have:

  • A basic understanding of Power Automate and JavaScript.
  • A Dynamics 365 CRM instance with permission to create flows and custom scripts.
  • A Power Automate flow with an HTTP request trigger.

SCENARIO:

Let us consider a scenario where you need to send an email when a button is clicked on the Case form. We will achieve this using JavaScript and a Power Automate flow.

STEP 1: CREATING THE POWER AUTOMATE FLOW

First, let’s create a simple power automate flow.

  1. Go to https://make.powerapps.com/, make sure you are in the appropriate environment, and create a new solution.
  2. Inside your solution, click on New → Automation → Cloud flow → Instant.
  3. Choose Instant Flow and select “When an HTTP request is received” as the trigger. This allows the flow to be triggered through an HTTP request. Click Create.
  4. Define the schema for the HTTP request body that you will send from JavaScript.
  5. Copy the JSON below, click “Use sample payload to generate schema”, and paste it.

For Example,

{
  "type": "object",
  "properties": {
    "recordId": {
      "type": "string"
    },
    "entityName": {
      "type": "string"
    }
  }
}
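A request body matching this schema, as sent by the JavaScript later in this post, would look like the following (the GUID is a hypothetical placeholder):

```json
{
  "recordId": "00000000-0000-0000-0000-000000000001",
  "entityName": "incident"
}
```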

6. Add your desired action to the flow. These actions could be anything, such as updating a record, sending an email, or creating new records. For our scenario, we are sending an email.


7. After configuring the flow, save it. Copy the HTTP POST URL from the trigger, as you will use it in the JavaScript code.


STEP 2: WRITING JAVASCRIPT TO CALL THE POWER AUTOMATE FLOW

We will now write the JavaScript that calls the Power Automate flow using the HTTP request trigger.

Here is the code snippet:

function callPowerAutomateFlow(primaryControl) {
  var formContext = primaryControl; 
  var recordId = formContext.data.entity.getId(); // Get the record ID
  var entityName = formContext.data.entity.getEntityName(); // Get the entity name

  // Replace with your Flow's HTTP request URL
  var flowUrl =
    "https://prod-139.westus.logic.azure.com:xx";

  var requestData = {
    recordId: recordId,
    entityName: entityName,
  };

  var req = new XMLHttpRequest();
  req.open("POST", flowUrl, true);
  req.setRequestHeader("Content-Type", "application/json");

  req.onreadystatechange = function () {
    if (req.readyState == 4 && req.status == 202) {
      Xrm.Utility.alertDialog("Flow triggered successfully."); // Success message
    } else if (req.readyState == 4) {
      Xrm.Utility.alertDialog("Error triggering the flow: " + req.statusText); // Error handling
    }
  };

  req.send(JSON.stringify(requestData)); // Send the data to the flow
}

This function does the following:

  • Captures the current record GUID and entity name.
  • Sends an HTTP POST request to Power Automate Flow URL, passing the record data.
  • Handles success or error messages, providing user feedback.
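As a variant on the XMLHttpRequest call above, the same logic can be sketched with fetch and a small payload-building helper. The helper names here are hypothetical, not part of the Client API; note that formContext.data.entity.getId() returns the GUID wrapped in braces, which you may want to strip before sending it to the flow:

```javascript
// Hypothetical helper: build the request payload for the flow,
// stripping the braces that getId() returns around the GUID.
function buildFlowPayload(formContext) {
  var recordId = formContext.data.entity.getId().replace(/[{}]/g, "");
  var entityName = formContext.data.entity.getEntityName();
  return { recordId: recordId, entityName: entityName };
}

// Hypothetical helper: POST the payload to the flow's HTTP trigger URL.
// The trigger replies 202 Accepted when the flow run is queued.
function callFlow(flowUrl, payload) {
  return fetch(flowUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload)
  }).then(function (response) {
    if (response.status !== 202) {
      throw new Error("Error triggering the flow: " + response.statusText);
    }
  });
}
```

Because the HTTP trigger responds with 202 Accepted as soon as the run is queued, a successful response only means the flow started, not that its actions completed.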

After writing the script, upload it as a JavaScript Web Resource in Dynamics 365 CRM.

STEP 3: ADDING A BUTTON TO A CRM FORM

Now, let’s add a button to the Case Form that will trigger the flow when clicked.

  1. Navigate to your app, such as Customer Service Hub.
  2. Click on Edit, navigate to the Case view, and click on Edit Command Bar.
  3. In the command editor, click on New Command.
  4. For the action, select Run JavaScript.
  5. For the library, select the JavaScript web resource we uploaded.
  6. Add PrimaryControl as a parameter.

By following these steps, you can trigger a Power Automate flow from a button click in Dynamics 365 CRM using JavaScript.

STEP 4: TESTING THE FLOW

After setting everything up, it is essential to test whether it works as expected.

  1. Navigate to the Customer Service Hub and open any Case record.
  2. Locate the Call Flow button you added to your form and click it.
  3. You should see an alert confirming whether the flow was triggered successfully or an error message if something went wrong.
  4. Check your Power Automate Flow’s run history to ensure the flow was executed, and confirm that the email was sent as expected.


By integrating Power Automate flows with JavaScript in Dynamics 365 CRM, you can streamline processes, automate tasks, and reduce manual effort, all triggered with a simple button click. This approach enhances the user experience and offers greater flexibility for building custom workflows in your CRM environment.

Try this out and see how it can improve your workflows!

Thursday, 20 February 2025

Portal user scope level - Dynamics 365

In Dynamics 365 (formerly Dynamics CRM), a "portal user scope level" refers to the level of access a user has within a portal. Depending on the entity permission configuration, it can be set to "Global" (access to all records), "Contact" (access only to records related to the user's contact record), or "Account" (access to records associated with their parent account). Essentially, it determines which records a user can view and interact with based on their relationship to the data in the system.

Key points about portal user scope levels:
  • Global Scope:
    A user with global scope can see all records within the system, regardless of their relationship to any specific contact or account. 
  • Contact Scope:
    A user with contact scope can only see records directly related to their own contact record, based on defined relationships within the CRM. 
  • Account Scope:
    A user with account scope can see records associated with their parent account, allowing access to related contacts and other data within that account hierarchy. 
How to manage scope levels:
  • Entity Permissions:
    Within the portal administration, you define entity permissions for each entity (like "Case" or "Account") and set the scope to either "Global", "Contact", or "Account" depending on the desired access level.
  • Web Roles:
    You assign these entity permissions to specific web roles, which are then assigned to portal users to control their access based on their role and the chosen scope. 
Example:
  • A customer service portal might use a "Contact" scope to let a customer view their own support cases, while an "Account" scope could be used for a company administrator to see all cases related to their company.
