I have a table in an Azure SQL database with three columns (ID, HASH, ITEM_JSON) and a Cosmos DB container.
The goal is to delete documents from the Cosmos DB container whenever the HASH column changes for a given ID key.
To accomplish this I thought of using a mapping data flow. Data flows have a transformation called 'Alter row' which, according to the Microsoft documentation, allows you to perform delete operations on various sinks, including Cosmos DB.
The partition key is configured as /id in the sink settings.
The problem is that when I debug the data flow it doesn't perform the corresponding deletions. What could be the problem? Any documentation is welcome.
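For reference, the relevant part of the generated data flow script looks roughly like this. It is only a sketch with placeholder stream and column names, not the exact script, and the partition key /id is set in the Cosmos DB sink settings in the UI, so it does not appear here. deleteIf(true()) marks every incoming row for deletion just to keep the sketch short; in my flow the condition is based on the HASH change.

source(output(
        ID as string,
        HASH as string,
        ITEM_JSON as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> SqlSource
SqlSource alterRow(deleteIf(true())) ~> MarkDelete
MarkDelete sink(allowSchemaDrift: true,
    validateSchema: false,
    deletable: true,
    insertable: false,
    updateable: false,
    upsertable: false,
    format: 'document',
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> CosmosSink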
I also tried to delete the documents from Cosmos DB via a Data Flow, but when I select 'Allow delete' and run the data flow it throws the error 'Key column' property should be mapped for delete and update operations.
This means a key column needs to be selected, but that option is not available for a Cosmos DB sink when 'Allow delete' is selected; it is only available for a SQL database sink.
You can raise a feature request here.
A workaround is to recreate the container with only the rows you don't want to delete.
First, I took a sample SQL table with three columns (ID, HASH, a sample JSON) as the source.
Then I joined it on the ID key with the table in which the HASH column has changed, as in the sketch below.
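In data flow script terms, the source and join look roughly like this (a sketch only: source2 is the sample SQL table, derivedColumn1 is the upstream stream carrying the changed HASH values and is not shown here, and the schemas are placeholders):

source(output(
        ID as string,
        HASH as string,
        ITEM_JSON as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> source2
source2, derivedColumn1 join(source2@ID == derivedColumn1@ID,
    joinType: 'inner',
    broadcast: 'auto') ~> join1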
Then I filtered out the rows where the HASH values differ. The filter transformation keeps only the rows for which the expression is true, i.e. the rows whose hash is unchanged, using the expression
source2@HASH==derivedColumn1@HASH
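As an approximate script line, with join1 being the output of the join sketched above:

join1 filter(source2@HASH == derivedColumn1@HASH) ~> filter1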
Then I selected the sample JSON column with a select transformation.
Then I parsed the JSON column with a parse transformation.
Then I selected the particular columns I needed with another select transformation; a rough script for these three steps follows.
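A sketch of the select, parse, and final select steps. The parsed fields (id, name) are only examples standing in for the real document schema, and the exact sub-field mapping in the last select may differ in your flow:

filter1 select(mapColumn(ITEM_JSON),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> select1
select1 parse(ITEM_JSON = ITEM_JSON ? (id as string,
        name as string),
    format: 'json',
    documentForm: 'singleDocument') ~> parse1
parse1 select(mapColumn(
        id = ITEM_JSON.id,
        name = ITEM_JSON.name
    ),
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> select2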
Finally, I wrote the result to the Cosmos DB sink with the recreate container option, which deletes and recreates the container with only the remaining rows.
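The sink, again as an approximate sketch. I set the 'Recreate container' collection action through the sink settings in the UI, so the recreate property shown below is an assumption about how it appears in the script rather than a confirmed name:

select2 sink(allowSchemaDrift: true,
    validateSchema: false,
    recreate: true,
    format: 'document',
    skipDuplicateMapInputs: true,
    skipDuplicateMapOutputs: true) ~> CosmosSink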
Also refer to this SO thread.