Accessing the “before” values of Dataverse rows, part 2: custom actions

In my earlier article, we went through how you can access the “before” state of Dataverse records after they are edited or deleted, by using classic workflows in Power Platform.

In that example, deletions and updates of rows were handled with a Power Automate cloud flow. The flow needed to know the state of the row before the change or deletion took place. This solution architecture required a custom log table in Dataverse.

This time we’re going to look at how to achieve the same result, but without a custom log table. The key to this is leveraging the custom actions feature in Dataverse. We’ll also be using classic workflows and a Power Automate cloud flow as part of the solution.

Let’s get started!

Example – Device settings

Our example Power App is used for managing devices and device configurations. One configuration contains several configuration items.

A device is always related to exactly one configuration.

The same configuration can be used for several different devices.

So our data model looks like this.

When a configuration is edited, the new configuration must be sent to each device that uses it.

This is not a problem when you add new configuration items or edit existing ones. But what if an item is deleted from a configuration? How do we find out which configuration the item was removed from? This is crucial, as we need that information to update all devices that use that configuration.

A flow triggered by the deletion of a configuration item row receives nothing more than the id of the deleted row. This is understandable, because the row has already been deleted by the time the flow runs asynchronously after the event.

So we need something else: something that can run synchronously as part of the deletion event and execute our required business logic.

Creating the Custom Action

Let’s start by creating a new action in the Power Apps maker portal. Actions are like classic workflows, but they are not automatically triggered when a row is added, modified, or deleted. Instead, they are triggered elsewhere as needed, typically from a classic workflow.

Let’s give the action a name and define it to be independent of any table (Add to = None).

Our action doesn’t actually do anything this time. It just takes in, as a parameter, the configuration that the deleted row was associated with.

Let’s add a new parameter (1) and give it a name (2). Define the parameter type (3) as a reference to the table row (EntityReference) and define the target of the reference as the right table (4).

The parameter direction is input.

Ready. Let’s save and activate.
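As a side note, an unbound custom action like this can also be invoked directly through the Dataverse Web API, not only from a classic workflow. The sketch below builds the request such a caller would send. The environment URL, the table name (tp_configuration), and its primary key column are illustrative assumptions, since the article doesn’t show them; in our solution the classic workflow does this call for us.

```python
# Sketch: invoking the unbound custom action via the Dataverse Web API.
# The environment URL, table name (tp_configuration) and primary key
# column (tp_configurationid) are illustrative assumptions.

def build_action_request(env_url: str, action_name: str, config_id: str):
    """Build the URL and JSON body for calling an unbound custom action
    whose single EntityReference input parameter is 'updatedTemplate'."""
    url = f"{env_url}/api/data/v9.2/{action_name}"
    body = {
        "updatedTemplate": {
            # EntityReference parameters are passed as an entity object
            # with an @odata.type annotation and the primary key set.
            "@odata.type": "Microsoft.Dynamics.CRM.tp_configuration",
            "tp_configurationid": config_id,
        }
    }
    return url, body

url, body = build_action_request(
    "https://contoso.crm.dynamics.com",
    "tp_Configurationitemisdeleted",
    "00000000-0000-0000-0000-000000000001",
)
print(url)
```

The body would then be POSTed to the URL with a bearer token; the workflow engine performs the equivalent call internally when it executes the action step.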

Classic Workflow

So what will trigger this action? Classic workflow. Let’s create one.

Uncheck “Run Workflow in the background”. This way, our workflow starts immediately and we have the opportunity to see the content of the record before it is deleted. It will be what is known as a real-time workflow, running synchronously inside the original transaction.

The workflow starts when any user deletes (2) a configuration item (Scope = Organization, 1). It is executed before (3) the row is actually deleted.

The workflow does nothing but perform an action.

Let’s select the action we just created from the list and move on to defining the parameters to be passed to it (Set Properties).

Our action has one parameter (updatedTemplate). Press Add (1), OK (2) and Save and Close (3).

Finally, save the workflow and activate it.


Power Automate Cloud Flow

Finally, the actual magic. We create a new flow that starts every time the action is performed in Dataverse.

Select the action from which the flow starts (tp_Configurationitemisdeleted).

When the action is executed, the flow starts and there you can find the identifier of the configuration passed as a parameter (updatedTemplate/Id ).

We can take advantage of this and list all the devices that use that configuration from which the line was just removed.

Filter rows = _tp_configuration_value eq triggerOutputs()?['body/InputParameters/updatedTemplate/Id']
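To make the filter expression concrete, the sketch below mimics what the flow does: it digs the configuration id out of the action trigger body and builds the OData filter string used in the List rows step. The shape of the trigger body mirrors the expression in the article; the helper function itself is hypothetical.

```python
# Sketch: extracting the configuration id from the custom action's
# trigger body and building the OData filter for the List rows step.
# The helper and the sample trigger body are illustrative.

def build_device_filter(trigger_body: dict) -> str:
    """Return the OData filter selecting devices whose lookup column
    _tp_configuration_value points at the configuration passed to the action."""
    config_id = trigger_body["InputParameters"]["updatedTemplate"]["Id"]
    return f"_tp_configuration_value eq {config_id}"

# Sample of what the flow's trigger output could contain.
trigger_body = {
    "InputParameters": {
        "updatedTemplate": {"Id": "11111111-2222-3333-4444-555555555555"}
    }
}
print(build_device_filter(trigger_body))
```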

Finally, we go through these devices one by one, and for each one we could send the new configuration to the device. In this example, however, we don’t actually do that; we make do with a Compose action.


We have created a flow that can access not only the identifier but also the other data of a deleted Dataverse row.

For this, in addition to a Power Automate cloud flow, we needed one action and one classic workflow. This is because modern cloud flows aren’t able to run synchronously when events take place in Dataverse. The classic workflows inherited from the Dynamics CRM / Dynamics 365 platform, however, support real-time execution as part of the database transaction.

With the help of the action, we pass the necessary information from the classic workflow to the flow. This way, we don’t need to create and manage the custom Dataverse log table that we used in my earlier post to transmit data. Such a table is in a way redundant, as Dataverse contains a built-in auditing feature, although that feature isn’t accessible to our custom business logic.

We could also apply our solution to the events when Dataverse rows are edited. What was the column value of the row before editing? This can be useful information in many cases. For example, when you need to enforce specific rules on allowed values of specific fields on a Power Apps form once it is submitted.
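To sketch that idea, suppose the real-time workflow passes both the “before” and “after” values of a status column to our custom logic. A transition table like the one below could then decide whether the edit is allowed. The column values and the rule itself are purely hypothetical examples, not something defined in this article.

```python
# Sketch: validating an edit against the row's "before" state.
# The status values and the transition rules are hypothetical.

ALLOWED_TRANSITIONS = {
    "Draft": {"Draft", "Active"},
    "Active": {"Active", "Retired"},
    "Retired": {"Retired"},  # a retired row can no longer change status
}

def is_allowed_change(before_value: str, after_value: str) -> bool:
    """Return True if moving from before_value to after_value is permitted."""
    return after_value in ALLOWED_TRANSITIONS.get(before_value, set())

print(is_allowed_change("Draft", "Active"))   # a legal transition
print(is_allowed_change("Retired", "Draft"))  # reopening is not allowed
```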

