In Swift, we can leverage DispatchQueue to prevent race conditions. With a serial queue, tasks are performed one at a time, in order. From https://developer.apple.com/library/content/documentation/General/Conceptual/ConcurrencyProgrammingGuide/OperationQueues/OperationQueues.html:
Serial queues (also known as private dispatch queues) execute one task at a time in the order in which they are added to the queue. The currently executing task runs on a distinct thread (which can vary from task to task) that is managed by the dispatch queue. Serial queues are often used to synchronize access to a specific resource.
But we can easily create a deadlock (see "How do I create a deadlock in Grand Central Dispatch?") by performing a sync inside an async on the same queue:
let serialQueue = DispatchQueue(label: "Cache.Storage.SerialQueue")
serialQueue.async {
    serialQueue.sync {
        print("perform some job")
    }
    print("this can't be reached")
}
The only way I see to prevent this deadlock is to use 2 serial queues, one for the sync function version and one for the async version. But this can cause a race condition when writeSync and writeAsync happen at the same time.
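A sketch of that two-queue approach (the type and queue labels here are illustrative, not from the original post): each variant gets its own serial queue, so sync never re-enters the queue it is already on, but the two queues can now touch the shared state concurrently, which is exactly the race described above.

```swift
import Dispatch

// Hypothetical storage type: separate serial queues for the sync and
// async variants avoid the self-deadlock, but a writeSync and a
// writeAsync submitted at the same time run on different queues and
// can race on `cache`.
final class Storage {
    private let syncQueue = DispatchQueue(label: "Cache.Storage.Sync")
    private let asyncQueue = DispatchQueue(label: "Cache.Storage.Async")
    private var cache: [String: String] = [:]

    func writeSync(key: String, value: String) {
        // Safe to call from anywhere except syncQueue itself.
        syncQueue.sync { self.cache[key] = value }
    }

    func writeAsync(key: String, value: String) {
        // Runs on asyncQueue — unsynchronized with syncQueue.
        asyncQueue.async { self.cache[key] = value }
    }
}
```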
I see that Node's fs module supports both sync and async functions, like fs.writeFileSync(file, data[, options]) and fs.writeFile(file, data[, options], callback). By offering both versions, users can call them in any order they want, so couldn't they easily create a deadlock like the one above?
So maybe fs has a clever approach that we can apply in Swift? How do we support both sync and async functions in a thread-safe manner?
serialQueue.async {
    serialQueue.sync {
        print("perform some job")
    }
}
This deadlocks because the code queues a second task on the same dispatch queue and then waits for that second task to finish. The second task can't even start, however, because the queue is serial and the first task is still executing (albeit blocked on an internal semaphore).
The way to avoid this kind of deadlock is to never do that. It's especially stupid when you consider that you can achieve the same effect with the following:
serialQueue.async {
    print("perform some job")
}
There are some use-cases for running synchronous tasks on a different queue from the one you are in.
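For example, synchronously hopping onto another serial queue to read state that it protects is a common and safe pattern, as long as the caller is not already on that queue (the queue label and variable names below are illustrative):

```swift
import Dispatch

// A serial queue that guards access to `counter`.
let stateQueue = DispatchQueue(label: "com.example.state")  // assumed label
var counter = 42

// From any *other* queue or thread, sync here is safe and returns a
// consistent snapshot. Calling this from a block already running on
// stateQueue would be the same-queue deadlock described above.
let snapshot = stateQueue.sync { counter }
print(snapshot)
```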
However, there is never a reason to synchronously dispatch something onto the same queue; you might as well just do the thing directly. Or to put it another way: if you simply write statements one after the other, they are already executing synchronously on the same queue.
I see that Node's fs module supports both sync and async functions, like fs.writeFileSync(file, data[, options]) and fs.writeFile(file, data[, options], callback). By offering both versions, users can call them in any order they want, so couldn't they easily create a deadlock like the one above?
That depends on how the two APIs are implemented. The synchronous version of the call might just do the work directly on the calling thread without involving any other thread. If it does grab another thread and then wait around until that other thread is finished, then yes, there is a potential for deadlock if the node.js server runs out of threads.
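As I understand it, fs.writeFileSync performs the system call directly on the calling thread rather than dispatching to another queue, which is why it cannot deadlock against fs.writeFile. A Swift analogue, sketched under that assumption (the FileStore type, queue label, and method names are mine, not from the question), does the synchronous work inline under a lock instead of sync-dispatching onto its own queue:

```swift
import Foundation

// fs-style dual API: the sync variant runs inline on the caller's
// thread under a lock — it never blocks waiting on its own queue — and
// the async variant dispatches to a serial queue that takes the same
// lock. Calling writeSync from inside an async block is then safe.
final class FileStore {
    private let lock = NSLock()
    private let queue = DispatchQueue(label: "FileStore.queue")  // assumed label
    private var contents: [String: Data] = [:]

    func writeSync(name: String, data: Data) {
        lock.lock()
        defer { lock.unlock() }
        contents[name] = data
    }

    func writeAsync(name: String, data: Data, completion: @escaping () -> Void) {
        queue.async {
            self.lock.lock()
            self.contents[name] = data
            self.lock.unlock()
            completion()
        }
    }
}
```

Because neither path ever waits for its own queue, the sync and async versions can be interleaved in any order without the self-deadlock from the question; the lock alone serializes access to the shared state.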