I have developed a data processing engine in MarkLogic to handle data being exported into another VPC (virtual private cloud). I am not going to get into why we need this particular solution; I am aware of MLCP, etc. The data has to go through an ETL and schema validation process before being offloaded to a third-party service that handles everything else.
Here is the basic design of the engine:
1. A document in the `exportable` collection (in the content database) is created or modified.
2. The `exportDataTrigger.sjs` module is fired by one of two trigger events:
   - Event details: `"collection-scope": { "uri": "exportable" }`, `"document-content": { "update-kind": "create" }`, `"when": "post-commit"`
   - Event details: `"collection-scope": { "uri": "exportable" }`, `"document-content": { "update-kind": "modify" }`, `"when": "post-commit"`
3. The document is inserted into the export database with the `export` collection by the `exportDataTrigger.sjs` module.
4. A document created in the `export` collection fires the `processExportData.sjs` module. Event details: `"collection-scope": { "uri": "export" }`, `"document-content": { "update-kind": "create" }`, `"when": "post-commit"`
5. The `processExportData.sjs` module performs the following operations on the document: it replaces the `export` collection with the `processed` collection via `xdmp.documentInsert`:

       xdmp.documentInsert(uri, processedDocument, xdmp.documentGetPermissions(uri), 'processed');

6. A document modified in the `processed` collection fires the `validateData.sjs` module. Event details: `"collection-scope": { "uri": "processed" }`, `"document-content": { "update-kind": "modify" }`, `"when": "post-commit"`
7. The `validateData.sjs` module performs the following operations on the document:
   - If `validated` is set to `true`, it replaces the `processed` collection with the `export-ready` collection via `xdmp.documentInsert`:

         xdmp.documentInsert(uri, processedDocument, xdmp.documentGetPermissions(uri), 'export-ready');

   - If `validated` is `false` (the default), it replaces the `processed` collection with the `needs-review` collection via `xdmp.documentInsert`:

         xdmp.documentInsert(uri, processedDocument, xdmp.documentGetPermissions(uri), 'needs-review');
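A minimal sketch of what the `validateData.sjs` module in steps 6–7 might look like, assuming JSON documents with a boolean `validated` property (the variable `uri` is the external variable MarkLogic supplies to a trigger module; the property name is taken from the description above):

```javascript
// validateData.sjs — trigger module sketch, not the actual implementation
declareUpdate();

var uri; // populated by the trigger framework with the triggering document's URI

const doc = cts.doc(uri);

// Route to 'export-ready' or 'needs-review' based on the validated flag
const target = doc.toObject().validated === true ? 'export-ready' : 'needs-review';

// Re-insert at the same URI, keeping permissions but swapping the collection
xdmp.documentInsert(uri, doc, xdmp.documentGetPermissions(uri), target);
```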
Everything works fine until the trigger on step #6: it gets executed twice, even though the document is only updated once. I added a lot of logs throughout the code to check each transaction. I also checked to make sure I didn't accidentally add the trigger twice. I currently have my Query Console set up to skip straight to step #3.
What would cause a trigger to fire twice on one update?
Edit:

I decided to comment out the code logic in the `validateData.sjs` module and left only the logs. It turns out the trigger is firing itself an extra time. I still don't understand why, since I am committing the document once, with a new collection the trigger should not fire on (either `export-ready` or `needs-review`).
Post-commit triggers are spawned and executed as a separate transaction from the one that updated the document.
You didn't show how the triggers were created or what options were set. Was the `recursive` parameter set to `false`?

https://docs.marklogic.com/trgr.createTrigger

> **recursive** — Set to `true` if the trigger should be allowed to trigger itself for recursive changes on the same document. Set to `false` to prevent the trigger from triggering itself. If this parameter is not present, then its value is `true`.

If not, then the changes made by your module to the same document could fire the modify trigger again.
See also "Avoiding Infinite Trigger Loops (Trigger Storms)":

> ...you can avoid trigger storms by setting the `$recursive` parameter in the `trgr.createTrigger()` function to `fn:false()`.
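For reference, here is a sketch of creating the `processed` trigger from step 6 with recursion disabled. The trigger name, module root, and database names are placeholders; the script must be evaluated against the triggers database of the export database:

```javascript
// Run against the triggers database configured for the export database
declareUpdate();
const trgr = require('/MarkLogic/triggers.xqy');

trgr.createTrigger(
  'validate-processed',                            // trigger name (placeholder)
  'Fires validateData.sjs when a document in the "processed" collection is modified',
  trgr.triggerDataEvent(
    trgr.collectionScope('processed'),
    trgr.documentContent('modify'),
    trgr.postCommit()),
  trgr.triggerModule(
    xdmp.database('export-modules'),               // modules database (placeholder)
    '/',                                           // module root (placeholder)
    'validateData.sjs'),
  true,                                            // enabled
  xdmp.defaultPermissions(),
  false                                            // recursive: the trigger cannot fire itself
);
```

With `recursive` set to `false`, the `xdmp.documentInsert` call inside `validateData.sjs` no longer re-fires the same trigger on the document it just updated.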